Hi
I have written a script that takes a text file as input and, for each line of the text file, adds a particle (using emit) to a previously created particle system. Then I use the particle -e command to set that particle's color from values that are also on the same line of the text file -
so a line of the text file is just:
[x pos], [y pos], [z pos], [r color], [g color], [b color]
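So for example a line might look like this (made-up values, just to show the format):

1.25, 3.00, -0.50, 0.8, 0.2, 0.1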
The problem is that the text files I am importing are extremely big - 26 MB, some with over 1 million lines of particle information...
When I cut the text file down to just a few thousand lines, the MEL script works great and doesn't take too long.
However, with one of the larger files it just takes forever to import the particle cloud... in fact, I have never waited long enough for one to fully import (even after 3 hours, I don't reckon it is even close to being done).
Does anyone have any suggestions on how to speed this up?
Again, the basics of the script:
fopen and fgetline to read the lines of the text file.
a simple loop that creates a particle (using emit) and then edits the particle to set its rgbPP (see the rough sketch below).
I also have the script disable both the undo queue and history while it runs.
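Roughly, the loop looks something like this - this is just a sketch to show the approach, not my actual script, and the particle shape name "particleShape1" and the file path are placeholders:

// Sketch only: "particleShape1" and the file path are placeholders.
// Assumes rgbPP has already been added to the particle shape as a per-particle attribute.
undoInfo -state off;                 // disable the undo queue while importing
constructionHistory -toggle off;     // stop recording construction history

int $fid = `fopen "C:/temp/particles.txt" "r"`;
string $line = `fgetline $fid`;
int $i = 0;                          // running particle index, used with -order below

while (size($line) > 0)
{
    string $tok[];
    tokenize $line "," $tok;         // x, y, z, r, g, b

    float $x = (float)$tok[0];
    float $y = (float)$tok[1];
    float $z = (float)$tok[2];
    float $r = (float)$tok[3];
    float $g = (float)$tok[4];
    float $b = (float)$tok[5];

    // add one particle at the position from this line
    emit -object "particleShape1" -position $x $y $z;

    // then set that particle's rgbPP from the same line
    particle -e -attribute "rgbPP" -order $i -vectorValue $r $g $b particleShape1;

    $i++;
    $line = `fgetline $fid`;
}

fclose $fid;
undoInfo -state on;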