CSV Upload into Oracle Table
The first thing you need is a page with a file browse item and a submit button. When you submit the page, the selected file is uploaded into the apex_application_files (or wwv_flow_files) view. The actual BLOB content of the file goes into the column BLOB_CONTENT, but since a CSV file is just plain text, it made sense to me to convert that BLOB to a CLOB so that you can perform regular string functions on the data. The first reference I found to converting a BLOB to a CLOB was in the OTN forums, and I confirmed it against the Oracle documentation. After converting it to a CLOB, it is simply a matter of looping over each row of the data and populating an APEX collection (or some other temp table). This is an intermediary step so you can review the data before committing the changes. Finally, you just move the data from the collection into the target table.
So the code for copying the blob into a clob and populating the collection can be seen below.
declare
  -- variables for copying the BLOB into a CLOB
  v_blob         BLOB;
  v_clob         CLOB;
  v_dest_offset  integer := 1;
  v_src_offset   integer := 1;
  v_lang_context integer := dbms_lob.default_lang_ctx;
  v_warning      integer;
  -- variables for iterating over each row of the CLOB
  v_new_line_pos NUMBER;
  v_start_pos    NUMBER := 1;
  v_current_line varchar2(4000);
  v_total_len    NUMBER;
  v_curr_row     apex_application_global.vc_arr2;
  -- constants
  c_collection_name constant varchar2(20) := 'COURSE_UNIT_MAPPING';
begin
  select blob_content
    into v_blob
    from apex_application_files
   where name = :P30_FILE;

  dbms_lob.createtemporary(v_clob, TRUE);
  dbms_lob.converttoclob(
    dest_lob     => v_clob,
    src_blob     => v_blob,
    amount       => dbms_lob.lobmaxsize,
    dest_offset  => v_dest_offset,
    src_offset   => v_src_offset,
    blob_csid    => dbms_lob.default_csid,
    lang_context => v_lang_context,
    warning      => v_warning);

  APEX_COLLECTION.CREATE_OR_TRUNCATE_COLLECTION(p_collection_name => c_collection_name);

  v_total_len := dbms_lob.getlength(v_clob);
  while v_start_pos <= v_total_len LOOP
    v_new_line_pos := instr(v_clob, chr(10), v_start_pos);
    IF v_new_line_pos = 0 THEN
      v_new_line_pos := v_total_len + 1;
    END IF;
    -- strip a trailing CR in case the file uses Windows-style line endings
    v_current_line := rtrim(substr(v_clob, v_start_pos, v_new_line_pos - v_start_pos), chr(13));
    v_curr_row := apex_util.string_to_table(v_current_line, ',');
    IF v_curr_row.COUNT = 2 THEN -- don't want to touch it if it doesn't have exactly 2 elements
      APEX_COLLECTION.ADD_MEMBER(
        p_collection_name => c_collection_name,
        p_c001 => v_curr_row(1),
        p_c002 => v_curr_row(2));
    END IF;
    v_start_pos := v_new_line_pos + 1; -- need to add one so it doesn't search the same range again
  END LOOP;

  -- I couldn't see the point in keeping the file around since all the data
  -- has now been populated into the collection
  delete from apex_application_files where name = :P30_FILE;
  commit;
end;
To avoid any issues in case someone tries to upload a bogus CSV file, I make sure there are exactly two elements in each row before adding it to the collection. Of course, there could be a lot more in-depth error checking, but this covers the basics and I feel it will do for the time being. Once I have reviewed what's in the collection and I am sure I want to upload that data, I can then go on to insert it into my table.
insert into type_range (type, range)
select c001, c002
  from apex_collections
 where collection_name = 'COURSE_UNIT_MAPPING';
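Once that insert has been committed, the collection itself can be cleaned up. This step is optional; APEX_COLLECTION.DELETE_COLLECTION is the documented call for removing a collection by name:

```sql
-- Optional cleanup once the data has been moved into the real table
begin
  APEX_COLLECTION.DELETE_COLLECTION(p_collection_name => 'COURSE_UNIT_MAPPING');
end;
```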
There is a caveat: if you expect a large volume of rows, I would avoid the collections framework in favour of your own temp table, as collections are too inefficient with large volumes of data.
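As a sketch of that alternative, a global temporary table could stand in for the collection (the name csv_stage and its columns here are hypothetical; adjust to your schema). The ADD_MEMBER call in the loop would then become a plain INSERT:

```sql
-- Hypothetical staging table; rows persist only for the session
create global temporary table csv_stage (
  c1 varchar2(4000),
  c2 varchar2(4000)
) on commit preserve rows;

-- Inside the loop, in place of apex_collection.add_member:
-- insert into csv_stage (c1, c2) values (v_curr_row(1), v_curr_row(2));
```

The review step then becomes a query against csv_stage instead of apex_collections, and the final insert selects from it directly.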