Export to XLS or CSV very slow

Hi Admin,

I’ve been using ScriptCase for about 6 months, and I currently have around 30,000 records in my table. I have a grid application to display these records, and I want to export all the details to an XLS or CSV file (30,000 records with 12 columns).

Now when I click the export button it takes 40 to 60 minutes just to export and download the Excel file. I have already changed some settings in my php.ini (max_execution_time, memory_limit, etc.), and I also checked my dedicated Windows server and increased some parameters to speed up my site.

None of this has solved my problem; exporting my records to an Excel file is still very slow. Please advise what else I can do to fix this issue.

I know the 30,000 records will keep growing because we are in a production environment, so I’m sure this will be HELL!!! if I have to export, say, 1M records to an Excel file.

Admin, or anyone who has solved this, please advise.

Regards,
Tasyo

I’m pretty sure it depends on memory, CPU, database (on the same server or another?), etc. One option to try is to create the CSV yourself using the PHP CSV functions. The query remains the same, and managing the cursor yourself gives you the opportunity to optimize your code. A button is easily added to the grid application, so the usability of the page doesn’t have to change much.
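To illustrate the suggestion, here is a minimal sketch of a hand-rolled CSV export using PHP’s own CSV functions. `streamCsv()` writes one row at a time with `fputcsv()`, so the full result set is never held in memory at once. The table and connection in the demo are stand-ins (in-memory SQLite) so the sketch runs on its own; in a real grid button you would use your own connection and query.

```php
<?php
// Sketch: stream a query result to CSV one row at a time with fputcsv(),
// instead of building the whole export in memory first.

function streamCsv(PDO $pdo, string $sql, $out): int
{
    $stmt  = $pdo->query($sql);
    $first = true;
    $count = 0;
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        if ($first) {                          // header row from column names
            fputcsv($out, array_keys($row), ',', '"', '\\');
            $first = false;
        }
        fputcsv($out, $row, ',', '"', '\\');   // one row at a time
        $count++;
    }
    return $count;
}

// Demo with a placeholder in-memory table. In a grid button you would
// instead fopen('php://output', 'w'), send Content-Type/Content-Disposition
// headers, and run the grid's own SELECT.
$pdo = new PDO('sqlite::memory:');
$pdo->exec("CREATE TABLE t (id INTEGER, name TEXT)");
$pdo->exec("INSERT INTO t VALUES (1, 'alpha'), (2, 'beta')");

$out = fopen('php://memory', 'w+');
$n   = streamCsv($pdo, 'SELECT * FROM t', $out);
rewind($out);
echo stream_get_contents($out);
echo "exported: $n rows\n";
```

One note on the design: against MySQL you would also want to disable buffered queries (`PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false` on the connection) so the rows stream from the database server as well, rather than being fetched into PHP all at once.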

This functionality is very important for SC, since end users can download the reports created with SC directly to Excel, so the SC team must try to solve it.
I had the same problems exporting some reports to Excel, and I hope this issue gets fixed.

Hi aducom, thank you for your reply. Please see my dedicated Windows server’s main specs.

Windows Dedicated Server

  • 4 GB RAM
  • 4 Core CPU
  • 1 TB Disk (RAID-1)
  • 15 TB Bandwidth
  • Regarding “One option to try is to create the csv by yourself using the php csv functions”: can you please send me sample function code to do it in the button? I will try it to see if it solves the issue, at least temporarily.
Thanks and Regards, Tasyo

This sounds like the memory setting is way too small (probably still at the default).
Increase the following in php.ini (choose your own values, though):
memory_limit = 512M (or more)

Try this one too:
post_max_size = 512M

Anyway, you need to enlarge the memory available to PHP in your ScriptCase setup (and in some cases Apache too).
The ScriptCase export code (CSV, PDF, …) caches all the data in memory before writing it out.
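When raising these limits, it is worth confirming that the PHP instance actually serving ScriptCase picked them up; editing the wrong php.ini (CLI vs. web server) is a common trap. A small check page, as a sketch, could be dropped into a blank SC application:

```php
<?php
// Print which php.ini is loaded and the limits actually in effect for
// this PHP instance, to verify that your edits took hold.
echo 'loaded php.ini:     ', (php_ini_loaded_file() ?: '(none)'), PHP_EOL;
echo 'memory_limit:       ', ini_get('memory_limit'), PHP_EOL;
echo 'max_execution_time: ', ini_get('max_execution_time'), PHP_EOL;
echo 'post_max_size:      ', ini_get('post_max_size'), PHP_EOL;
```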

You could also help yourself by raising the FastCGI limits in your Apache configuration (examples):
#FcgidIOTimeout 512
FcgidIOTimeout 1024
#FcgidConnectTimeout 512
FcgidConnectTimeout 2048
#50Meg
#FcgidMaxRequestLen 536870912
FcgidMaxRequestLen 1073741824
#FcgidMaxRequestLen 52428800
#FcgidMaxRequestLen 800000000
FcgidMaxRequestsPerProcess 10000

Timeout 300
KeepAlive On
#KeepAliveTimeout 50
#KeepAliveTimeout 10
KeepAliveTimeout 5
MaxKeepAliveRequests 500

I’ve been experimenting with these to get it working with 8 users on a 4 GB machine.

Hi rr, thanks for your reply. I just adjusted some of the parameters in my php.ini file, but exporting is still very slow.
I hope the admin will reply to this post.

SC uses PHPExcel, a very good library for importing and exporting xlsx files. I use it extensively from my own custom PHP code for both functions (not using the built-in SC export button on the grid, etc.). The reason I do this in my own code is to optimize my export and import process to make it acceptably fast. I only occasionally use the SC export button, so I can’t directly advise you on how to improve that, but if you Google PHPExcel you will find lots of great documentation and examples. You can just include the library that SC has already packaged up for you.

That said, here is some general advice on both export and import, based on some extensive work I have done here using SC and the PHPExcel library:

  1. PHPExcel is very stable and robust and used by lots of people. I did try some other libraries and kept coming back to PHPExcel.

  2. You might be able to use a MySQL command and some PHP code to write your huge table directly to a file:
    SELECT * INTO OUTFILE 'c:/datafile.csv'
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    FROM large_data_table;
    It only goes to CSV, but it can handle large datasets. I implemented a full test of this from some custom code in SC, and it worked, but only on my development server. My HIPAA-compliant server host has that command locked out for security reasons, so I had to abandon it for the production site.

  3. I have large datasets to export, but the real time eater for me has been exporting with formatting and validation rules, etc., to an xlsx file. I was always running the risk of timing out in PHP. I am not exporting on a lone system where I can afford to use 100% of the memory and set the timeout very long; it is a production site with many users.
    So I ended up implementing both an exporter and an importer using PHPExcel in batch mode. What I mean is I create an export job, inserting a record into a table that stores all the options, settings, and filenames, with status = ready. A cron job runs every minute, looks for a ready export job, grabs the next job record in line, and sets its status = processing. I also only let such a job run (processing records) for about 35 seconds per pass; my PHP max execution time is set to 60 seconds. This batch processing runs in an SC blank application, which conveniently keeps all the files in the same SC project as the rest of the apps.
    After 35 seconds my script cleans up memory, saves where it is (pointers and such) into the same job record, and sets the status to pending. That way, even if it takes several passes through the job to handle a huge number of records, it can never time out. I also use a simple SC grid app, refreshing every 20 seconds, to give users a way to see the status of their job along with a % progress indicator. When the file is done, it creates a link to the exported file (sitting on the server in an off-root folder) so they can download it; that takes a little coding but is pretty easy.

  4. I have also found that really large exports might be better broken into multiple xlsx files. They might have to be recombined after they are downloaded, but that takes seconds in Excel. This gets around the problem of using too many system resources (processor and memory), and I suspect that is why your export is running so long: PHP uses up so many resources that it can only work in small chunks. I have to be very careful to unset arrays and do other housekeeping in my code. I even have my batch script check CPU load and skip a one-minute cron attempt if too many people are doing things. I suggest you get graphs of CPU and memory usage from your server host; that helped me see what was happening. I used to hit something like 500% CPU load; now I run at about 6%.

  5. These libraries can generally export csv like a zillion times faster. Excel files are much more time consuming I have found. In my case, I have to have xlsx files in and out.

  6. A note on importing: I actually use another library called Spout (similar to PHPExcel) to help me with imports. It lets me pull in an uploaded xlsx file and break it into smaller CSV files on the server. Then I use PHPExcel to read in the large record sets. Again, I found the other library could do that first step without opening the whole file in RAM (sucking up all resources). Through trial and error, anything can be accomplished. :)
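The time-boxed batch pattern described in point 3 above can be sketched roughly as follows. This is not Jamie’s actual code; the table and column names (`export_jobs`, `big_table`, `row_offset`) are invented for illustration, and the demo at the bottom uses an in-memory SQLite table so the sketch runs standalone. The idea is the same: a cron-driven script claims one pending job, exports rows for at most a fixed time budget, saves its offset, and lets the next run continue.

```php
<?php
// Sketch of a cron-driven, time-boxed batch export. All names are
// placeholders; the real schema would also carry filenames, options, etc.

const BUDGET_SECONDS = 35;   // keep well under max_execution_time

function runExportBatch(PDO $pdo, int $chunk = 500): void
{
    $start = time();
    // Claim the next job that is ready or was left half finished.
    $job = $pdo->query(
        "SELECT * FROM export_jobs
          WHERE status IN ('ready', 'pending')
          ORDER BY id LIMIT 1"
    )->fetch(PDO::FETCH_ASSOC);
    if (!$job) {
        return;                              // nothing to do this minute
    }
    $id     = (int) $job['id'];
    $offset = (int) $job['row_offset'];
    $pdo->exec("UPDATE export_jobs SET status = 'processing' WHERE id = $id");
    $out = fopen($job['file'], $offset === 0 ? 'w' : 'a');

    while (time() - $start < BUDGET_SECONDS) {
        $rows = $pdo->query(
            "SELECT * FROM big_table ORDER BY id LIMIT $chunk OFFSET $offset"
        )->fetchAll(PDO::FETCH_ASSOC);
        if (!$rows) {                        // all rows written: job done
            fclose($out);
            $pdo->exec("UPDATE export_jobs
                           SET status = 'done', row_offset = $offset
                         WHERE id = $id");
            return;
        }
        foreach ($rows as $row) {
            fputcsv($out, $row, ',', '"', '\\');
        }
        $offset += count($rows);
    }
    // Time budget used up: save progress for the next cron run.
    fclose($out);
    $pdo->exec("UPDATE export_jobs
                   SET status = 'pending', row_offset = $offset
                 WHERE id = $id");
}

// Standalone demo: three rows stand in for a huge table.
$pdo = new PDO('sqlite::memory:');
$pdo->exec("CREATE TABLE big_table (id INTEGER, val TEXT)");
$pdo->exec("INSERT INTO big_table VALUES (1, 'a'), (2, 'b'), (3, 'c')");
$pdo->exec("CREATE TABLE export_jobs (
                id INTEGER PRIMARY KEY, status TEXT,
                row_offset INTEGER, file TEXT)");
$file = tempnam(sys_get_temp_dir(), 'exp');
$pdo->exec("INSERT INTO export_jobs (status, row_offset, file)
            VALUES ('ready', 0, '$file')");
runExportBatch($pdo);
echo file_get_contents($file);
```

Because progress lives in the job record rather than in the running request, the script can never hit the PHP timeout, and a status grid over `export_jobs` gives users their progress indicator for free.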

Anyway, I happened on your post and felt like sharing some of my recent experiences. SC has been a great platform for me - you can do a lot with it - especially if you dig into custom code solutions. Just leverage what they give you. Good luck,
Jamie

Thank you for your contribution, I am sure it will be very useful. ;)

Hi Jamie

I have been having a RAM problem importing XLSX files using PHPExcel. I managed to use it without problems with smaller files, but with big files the memory use on the server goes to 100% and restarts the server without warning. I saw this option (Spout) on the Internet and it sounds good, but I am trying to make it work with SC and so far I can’t.

It seems that namespaces don’t work in a control application in SC.

If you are using Spout, can you help me with how to use it in SC?

Thanks
Jairo

Could you give me some guidance on Spout? I find that trying to read a large file (24 MB) takes 30 minutes for this statement to execute:
$reader->open($inputFileName); and yet it still times out.
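For reference, a minimal streaming read with Spout 3.x (the box/spout package) looks like the sketch below. The file path and the autoloader location are assumptions you would adapt to your own setup. With Spout, `open()` itself should return quickly, because rows are streamed lazily from the xlsx as you iterate rather than being preloaded; if `open()` takes minutes, it usually means a different (in-memory) reader is actually being loaded. On the namespace problem in SC: `use` statements must sit at the top of the file, so inside SC event code it can be easier to call the fully qualified name (`\Box\Spout\...`) directly.

```php
<?php
// Sketch: streaming a large XLSX with Spout 3.x. Requires the box/spout
// package; the autoloader path below is an assumption -- point it at
// wherever the library is unpacked in your SC environment.
require_once __DIR__ . '/vendor/autoload.php';

use Box\Spout\Reader\Common\Creator\ReaderEntityFactory;

$reader = ReaderEntityFactory::createXLSXReader();
$reader->open('/path/to/large-file.xlsx');   // fast: nothing is preloaded

$count = 0;
foreach ($reader->getSheetIterator() as $sheet) {
    foreach ($sheet->getRowIterator() as $row) {
        $cells = $row->toArray();            // one row in memory at a time
        $count++;
        // ... validate / insert $cells here ...
    }
}
$reader->close();
echo "rows read: $count\n";
```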