
Hello all,

Does anyone have any idea how to upload a really big file (750 MB) into a BLOB field using FireDAC?

1. If I use the "LoadFromFile" method on the parameter - the result is "Out of memory";
2. If I use the "LoadFromStream" method on the parameter - the result is "Out of memory";
3. If I use "BLOB streaming (external streams)" - my program "freezes" during the execution of "ExecSQL". Is there a useful method for showing the progress of the process?
Example:
q_InsertFile.ParamByName('FILE_DATA').AsStream := TFileStream.Create(OpenDialog1.FileName, fmOpenRead or fmShareDenyWrite);  // read access is enough; FireDAC owns and frees the stream
q_InsertFile.ExecSQL;


In the Embarcadero community I found comments about "Out of memory", about the next FireDAC version, and about "...workaround is possible...":
http://community.embarcadero.com/index.php/answers/uploading-large-blobs-to-sql-server-varbinary-max-field-does-firedac-help-memory-management

But the link referenced there is no longer available:
https://forums.embarcadero.com/thread.jspa?threadID=103765


My data:
The database server is Firebird 2.1.7 (x64)
Delphi XE7 Enterprise + Update 1
Windows 7 / Service Pack 1 / x64
Available memory: 8387032 KB

Comments

  1. Presumably the problem is that it is not possible to find 750 MB of contiguous unused address space in a 32-bit process. 64-bit would solve that. LARGEADDRESSAWARE might also save you. Best would be to process the blob in chunks. No idea if the Delphi BLOB wrappers allow chunk processing.
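
    (A minimal sketch of the LARGEADDRESSAWARE route, set straight in the .dpr - the constant is declared in Winapi.Windows; the project name is hypothetical:)
    ** **
    program BlobUpload;

    uses
      Winapi.Windows;  // declares IMAGE_FILE_LARGE_ADDRESS_AWARE

    {$SETPEFLAGS IMAGE_FILE_LARGE_ADDRESS_AWARE}  // lets a 32-bit process address up to 4 GB on x64 Windows

    begin
      // ... the rest of the project as usual
    end.
    ** **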

  2. David Heffernan Hi David, and thanks for the suggestions.
    Look at my first link: "...Next FireDAC version will introduce "real BLOB streaming" feature, where BLOB parameters will be transferred to a DB without copying a value several times in memory, but..."
    So... for me, there is a bug :)
    Going 64-bit or using something like IMAGE_FILE_LARGE_ADDRESS_AWARE is not a good solution for me.
    I know that chunk processing is the better solution, but I can't find an example or documentation for it.
    Maybe my question (asking for help) should be:
    How do I insert a big file into a blob field in chunks?
    Because my program "freezes" during the execution of "ExecSQL".
    Is there a useful method for showing the progress of the process?
    (process the blob in chunks)
    :)

  3. I tried some things in the small hours of the night and... the solution is finally found. :)
    Now I have a progress bar connected to the process, and of course my program does not freeze any more.
    Also... I will make a demo from my stuff as soon as possible. :)

  4. Oliver Funcke Yes - "SetStream" works, but... not in all cases (based on my test results). Also, the "freezing" effect still exists.
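
    (For anyone reading along - if I understand Oliver's suggestion, the call looks roughly like this; I am quoting the SetStream signature from memory, so treat the exact parameters as an assumption:)
    ** **
    fs := TFileStream.Create(OpenDialog1.FileName, fmOpenRead or fmShareDenyWrite);
    // second argument True: FireDAC takes ownership of the stream and frees it
    q_InsertFile.ParamByName('FILE_DATA').SetStream(fs, True);
    q_InsertFile.ExecSQL;
    ** **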

  5. The true reason for my current problems is that the current version of FireDAC does not yet support internal streaming for Firebird and InterBase:
    http://docwiki.embarcadero.com/RADStudio/XE7/en/Support_for_Blob_Streaming_in_FireDAC

    My temporary solutions are:
    1. The first method - insert a new record with one chunk in the field, then update that field in a loop, appending chunk by chunk from the file stream;
    2. The second method - save the file in two database tables:
     - the header;
     - the data (this table contains the chunks of the file);

  6. For the first method (insert a chunk and then update the field, appending chunk by chunk)... unfortunately, it works fine only for "small" files;
     - for "big" files this method is... very, very slow.

    I can only guess that, this way, on every step of the process the component must retrieve the current data from the database and, after appending a chunk, save all of the data (including the new chunk) again - not only the new chunk :(

    - Also:
     --- you must rebuild your program as 64-bit, as David already suggested, which is not OK for me. For "small" files and old PCs it is OK, but with old PCs, "big" files and strange customers... I can't pressure the clients to upgrade for something like this.

    For comparison, the 'LoadFromStream' or 'AsStream' methods + an x64 app are incomparably faster.
     :)

    ** **
    Details:

    1. Insert the first byte of the file into the BLOB field (this new row gets some ID, for example);

    2. Add a TFDQuery component to the form (or data module);
    the SQL is something like:
    "select first 1 skip 0 file_data from this_table where id = :id order by id desc;"

    3. Add an update object for the "file_data" field of this query;

    4. Update this field chunk by chunk with something like:
    ** **
    q_UpdateFile.Edit;  // the dataset must be in edit mode to open a writable blob stream
    blobF := q_UpdateFile.FieldByName('FILE_DATA') as TBlobField;
    bs := TFDBlobStream.Create(blobF, bmReadWrite);
    bs.Seek(0, soFromEnd);                // append after the data already stored
    bs.CopyFrom(FILE_STREAM, byteCount);  // write the next chunk from the file
    bs.Free;
    q_UpdateFile.Post;
    ** **
     - do not forget to commit the transaction on every step;
     - if your calculation of byteCount is correct, showing the progress is trivial;
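
    Putting it all together, the whole upload loop could look something like this (a rough sketch: q_UpdateFile and FILE_DATA are from the steps above; FDConnection1, ProgressBar1 and CHUNK_SIZE are invented for illustration; it assumes an explicit transaction and uses System.Math and FireDAC.Comp.DataSet):
    ** **
    const
      CHUNK_SIZE = 1024 * 1024;  // 1 MB per step; tune to taste
    var
      fileStream: TFileStream;
      blobF: TBlobField;
      bs: TFDBlobStream;
      byteCount: Int64;
    begin
      fileStream := TFileStream.Create(OpenDialog1.FileName, fmOpenRead or fmShareDenyWrite);
      try
        ProgressBar1.Max := fileStream.Size;  // fits in an Integer for files < 2 GB
        while fileStream.Position < fileStream.Size do
        begin
          byteCount := Min(CHUNK_SIZE, fileStream.Size - fileStream.Position);
          q_UpdateFile.Edit;
          blobF := q_UpdateFile.FieldByName('FILE_DATA') as TBlobField;
          bs := TFDBlobStream.Create(blobF, bmReadWrite);
          try
            bs.Seek(0, soFromEnd);               // append after what is already stored
            bs.CopyFrom(fileStream, byteCount);  // next chunk from the file
          finally
            bs.Free;
          end;
          q_UpdateFile.Post;
          FDConnection1.Commit;                  // commit on every step
          ProgressBar1.Position := fileStream.Position;
          Application.ProcessMessages;           // keep the UI and the progress bar alive
        end;
      finally
        fileStream.Free;
      end;
    end;
    ** **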

  7. The second method - save the file in two database tables (a sketch follows below):
     - header_table, with the needed data about the file (and the mandatory ID field);
     - data_table, which:
     --- contains "ht_id", the link to the header_table;
     --- contains "file_data", the blob field;
    - in this case, everything about the "freezing" and about processing chunk by chunk is clear;
    :)
    Not so elegant, but temporary...
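
    (A minimal sketch of this variant - the DDL and the names data_table, q_InsertChunk, HeaderId and CHUNK_SIZE are invented for illustration; uses System.Math:)
    ** **
    // Assumed DDL (Firebird), matching the description above:
    //   create table header_table (id integer not null primary key, file_name varchar(255));
    //   create table data_table (ht_id integer not null, seq_no integer not null, file_data blob sub_type 0);
    // q_InsertChunk.SQL:
    //   insert into data_table (ht_id, seq_no, file_data) values (:ht_id, :seq_no, :file_data)
    const
      CHUNK_SIZE = 1024 * 1024;
    var
      fileStream: TFileStream;
      chunk: TMemoryStream;
      seqNo: Integer;
    begin
      fileStream := TFileStream.Create(OpenDialog1.FileName, fmOpenRead or fmShareDenyWrite);
      chunk := TMemoryStream.Create;
      try
        seqNo := 0;
        while fileStream.Position < fileStream.Size do
        begin
          chunk.Clear;
          chunk.CopyFrom(fileStream, Min(CHUNK_SIZE, fileStream.Size - fileStream.Position));
          chunk.Position := 0;
          Inc(seqNo);
          q_InsertChunk.ParamByName('HT_ID').AsInteger := HeaderId;  // id of the header_table row
          q_InsertChunk.ParamByName('SEQ_NO').AsInteger := seqNo;    // keeps the chunks in order
          q_InsertChunk.ParamByName('FILE_DATA').LoadFromStream(chunk, ftBlob);
          q_InsertChunk.ExecSQL;  // one chunk per row, so memory use stays around CHUNK_SIZE
        end;
      finally
        chunk.Free;
        fileStream.Free;
      end;
    end;
    ** **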

  8. Still trying to imagine the scenario where loading 750MB blobs into a database is a good idea.

  9. Kevin Powick Does it matter?
    I have a client who wants specific things for his own specific reasons.
    750 MB is not a common case in the project, but it is one of the possible variants - something like an upper limit.
    Personally, I also think it is a stupidity of stupidities, but... you know - the client is always right
    :)

  10. Dobrin Petkov I understand caving to a client's demands; it happens. However, part of a developer's job is to show our clients better ways of doing things. Not saying you haven't tried. It's just a general sentiment.

  11. Kevin Powick Yep, I tried. :)
    This value is an upper limit in the project. Common values (99%) in this project are between 1 KB and 3 MB.
    But... this is a good reason to improve some things in FireDAC - don't you think so?
    With the old InterBase components, a problem like this simply does not exist. I tried - there is no problem doing the "insert" operation chunk by chunk, without the update phase.

