- use nodes -> MEL is about using nodes
- disable undo (see the first sketch after this list).
- stop updating the global selection list.
- don't rely on auto naming; it's a linear (O(n)) operation, so doing it in a loop makes the loop O(n^2), and in a loop of loops O(n^(depth+1)) (see the second sketch after this list).
- don't do massive loops within loops.
- don't brute force -> see if you can make the algorithm an O(n log n) one
- user doesn't have enough memory -> buy more; if you have less than 8 gigabytes upgrade, better yet get 32
- trying to update a BIG number of vertex indices in a loop -> make a node instead
- ...
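For the undo point, a minimal sketch of what that bracketing looks like; the cube name and the loop body are placeholders, not anything from your script:

    undoInfo -state off;    // stop recording undo entries while we churn
    for ($i = 0; $i < 10000; $i++) {
        // hypothetical heavy work; the point is the undoInfo bracketing around it
        setAttr "pCube1.translateX" ($i * 0.1);
    }
    undoInfo -state on;     // always restore the undo queue when done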
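And for the auto naming point: letting Maya pick the next free name means a lookup that grows with scene size, while passing a unique name up front skips that. A hedged sketch, assuming a plain cube-creation loop (the names are made up):

    // slow in a big scene: each polyCube has to hunt for a free "pCube<N>" name
    for ($i = 0; $i < 10000; $i++) {
        polyCube;
    }

    // cheaper: hand Maya a unique name yourself
    for ($i = 0; $i < 10000; $i++) {
        polyCube -name ("myCube_" + $i);
    }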
Programming something that works but doesn't scale is easy; doing something that both works and scales is not necessarily so easy anymore, because it suddenly requires a programmer to do algorithm design.
Thing is, this is actually not likely to be MEL's fault, and it's not often even about the fact that MEL is in general slow. It's much more often about the writer of the MEL code abusing Maya in an unproductive way and not leveraging what MEL is supposed to do.
An illustrative example:
Once, years back, a user was frustrated that MEL was slow. He basically had a loop that looked like this:
for ($i = 0; $i < 10000; $i++) {
    for ($obj in $lotsOfObjects) {
        // many getAttr calls, sampled at time -1
        // do a computation
        // many setAttr calls
        // many setKeyframe calls
    }
    currentTime $i;
}
Now his problem was that this took forever to run, partly because his computation was redundant, but much more importantly because of currentTime $i;. The script had served him well for many years, but he had never tested it at this grand scale.
Turns out that the problem was solved as follows:
- make it an expression and cache the values
- disable viewport refresh (it turned out his machine took 0.2 seconds to refresh the screen, and 0.2 * 10000 = 2000 seconds, which by itself is already more than 30 minutes)
- finally, use baking
This made his multi-hour script execute in much less than a second.
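Roughly, the pattern was the one below. This is a sketch, not his actual script: the object, attribute and frame range are placeholders, and you should check the bakeResults flags against your Maya version's docs.

    refresh -suspend true;   // stop the viewport from redrawing on every time step
    undoInfo -state off;     // and stop filling the undo queue

    // with the per-frame computation moved into an expression, bakeResults can
    // step through the whole range in one pass and write the keys itself,
    // instead of the script calling currentTime 10000 times
    bakeResults -simulation true -t "1:10000" -sampleBy 1
                -preserveOutsideKeys true -attribute "ty" "pSphere1";

    undoInfo -state on;
    refresh -suspend false;
    refresh;                 // one final redraw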
So really, to say something of value I'd need to know what it is your script does.