Executing long-running console commands¶
This page describes how to execute long-running console commands while making sure they don't run out of memory. Examples include a custom import command or the indexing command provided by the Solr Bundle.
Reducing memory usage¶
To avoid quickly running out of memory while executing such commands, make sure to:
Always run in the prod environment.
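For instance, assuming the Symfony 2-era `app/console` entry point used elsewhere on this page (the command name itself is a placeholder), `--env=prod --no-debug` runs in the prod environment with debug mode off:

```bash
php app/console <command> --env=prod --no-debug
```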
Avoid Stash (Persistence cache) using too much memory in prod:
If your system is running, or you need to use cache, then disable the Stash InMemory cache, as it does not limit the number of items in cache and its memory use grows unbounded:
```yaml
# config_prod.yml (snippet, not a full example for stash config)
stash:
    caches:
        default:
            inMemory: false
```
Also, if you use the FileSystem driver, make sure `memKeyLimit` is set to a low number. The default should be 200, and it can be lowered like this:
```yaml
# config_prod.yml
stash:
    caches:
        default:
            FileSystem:
                memKeyLimit: 100
```
If your setup is offline and the cache is cold, there is no risk of stale cache, so you can completely disable the Stash cache. This will improve the performance of import scripts:
```yaml
# config_prod.yml (full example)
stash:
    caches:
        default:
            drivers: [ BlackHole ]
            inMemory: false
```
For logging with Monolog, if you use either the default `fingers_crossed` handler or the `buffer` handler, make sure to specify `buffer_size` to limit how large the buffer grows before it gets flushed:
```yaml
# config_prod.yml (snippet, not a full example for monolog config)
monolog:
    handlers:
        main:
            type: fingers_crossed
            buffer_size: 200
```
Run PHP without memory limits using:

```bash
php -d memory_limit=-1 app/console <command>
```
Avoid enabling xdebug (a PHP extension used to debug/profile PHP) when running the command, as it will cause PHP to use much more memory.
Memory will still grow
Even when everything is configured as described above, memory will grow for each iteration of indexing/inserting a content item, by at least 1 kB per iteration after the first 100 iterations. This is expected behavior; to handle more iterations you will have to do one or several of the following:
- Change the import/index script in question to use process forking to avoid the issue.
- Upgrade PHP: newer versions of PHP are typically more memory-efficient.
- Run the console command on a machine with more memory (RAM).
Process forking with Symfony¶
The recommended way to completely avoid such "memory leaks" in PHP is to use separate processes. For console scripts this is typically done with process forking, which is quite easy to do with Symfony.
You will need to:
- Change your command so it accepts slice parameters, for instance a batch size and a child-offset parameter.
  - If the child-offset parameter is defined, the process is a child process (this could also have been accomplished with two separate commands).
  - If it is not defined, the process is the master process, which executes child processes until nothing is left to process.
- Change the command so that the master process takes care of forking child processes in slices.
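A minimal sketch of the master-process side, assuming a hypothetical `app:import` command with `--batch-size` and `--child-offset` options (the command name, option names, and total count are illustrative, not part of any real bundle). Each slice runs in its own child process via Symfony's Process component, so all memory is released when the child exits:

```php
<?php
// Master process: fork one child per slice using the Process component.
use Symfony\Component\Process\Process;

$batchSize = 50;    // items handled per child process (hypothetical value)
$total     = 1000;  // e.g. fetched up front via a count query

for ($offset = 0; $offset < $total; $offset += $batchSize) {
    // Each child runs the same command with --child-offset set, so it
    // processes only its slice and then exits, freeing all its memory.
    $process = new Process(sprintf(
        'php app/console app:import --env=prod --no-debug --batch-size=%d --child-offset=%d',
        $batchSize,
        $offset
    ));
    $process->setTimeout(null); // slices may be long-running
    $process->mustRun();        // throws an exception on failure

    echo $process->getOutput();
}
```

Because the master process only spawns children and collects their output, its own memory footprint stays flat no matter how many items are processed.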