Load data into a table from a .csv file
Suppose you have an Excel file (login_match.xls) with 338 rows and three columns:
1. Last Name
2. First Name
3. Title / Position
Step-1: Make a login_match.csv file from that Excel file (e.g. Save As and choose CSV format).
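Each line of the .csv should hold the three fields in the same order as the Excel columns, separated by commas, with quotes around any field that itself contains a comma. The two rows below are made-up sample data, not the real contents of login_match.csv:
Smith,John,Manager
"O'Brien, Jr.",Mary,"Director, Operations"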
Step-2: Create a table on terminus (prod7/thword7).
The table has three columns because the Excel file has three columns, and the data types should match the data in the Excel file:
CREATE TABLE temp_login
(first_name varchar2(64),
last_name varchar2(64),
title varchar2(64)
);
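To confirm the table exists before loading, you can describe it from SQL*Plus (this assumes you can log in to the prod7 schema):
bash-3.00$ sqlplus prod7
SQL> DESC temp_login;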
Step-3: Write a control file (login_match.ctl). The control file structure is given below.
Step-4: Copy the .csv and .ctl files from home to terminus:
scp login_match.ctl login_match.csv oracle@terminus:
password:
Step-5: On terminus-wiki, run SQL*Loader:
bash-3.00$ sqlldr USERID=prod7 control=login_match.ctl
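sqlldr prompts for the prod7 password and then loads the file. If anything goes wrong, it helps to name the log and bad files explicitly; the file names below are only suggestions:
bash-3.00$ sqlldr USERID=prod7 control=login_match.ctl log=login_match.log bad=login_match.bad
Rejected rows are written to the bad file, and the log file reports how many rows were loaded. Afterwards you can check the row count (it should be 338 if every row loaded):
SQL> SELECT COUNT(*) FROM temp_login;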
--------------------done----------------------------
Control File structure (login_match.ctl):
Note that the fields must be listed in the same order as they appear in the .csv file; since the Excel columns are Last Name, First Name, Title, the last name field comes first.
LOAD DATA
infile 'login_match.csv' "str '\n'"
INTO TABLE temp_login
fields terminated by ',' optionally enclosed by '"'
(
last_name CHAR,
first_name CHAR,
title CHAR
)
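The control file above uses the default INSERT mode, which requires temp_login to be empty; re-running the load against a non-empty table fails. A sketch of a re-runnable variant, using the standard TRUNCATE keyword (APPEND also works if you want to keep existing rows):
LOAD DATA
infile 'login_match.csv' "str '\n'"
TRUNCATE
INTO TABLE temp_login
fields terminated by ',' optionally enclosed by '"'
(last_name CHAR, first_name CHAR, title CHAR)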
Possible errors are:
- ORA-1401: inserted value too large for column (the value is longer than the table column allows).
- Field in data file exceeds maximum length (a field is longer than SQL*Loader's default 255-character limit for CHAR fields).
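Both errors are length problems. For ORA-1401, widen the table column; for the second error, declare a larger length on the CHAR field in the control file (SQL*Loader assumes 255 otherwise). The sizes below are only examples:
SQL> ALTER TABLE temp_login MODIFY (title VARCHAR2(256));
and in the control file:
title CHAR(1000)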
Related: http://www.orafaq.com/wiki/SQL*Loader_FAQ
Interesting: http://neowiki.neooffice.org/index.php/Using_Find_and_Replace