ZF-6383: limit raised on preg_replace when using javascriptCaptureStart/javascriptCaptureEnd on a long script

Issue Type: Bug Created: 2009-04-22T09:38:41.000+0000 Last Updated: 2011-02-18T09:44:09.000+0000 Status: Resolved Fix version(s): Reporter: Guillaume ORIOL (goriol) Assignee: Ralph Schindler (ralph) Tags: - Zend_View

Related issues: Attachments:


I discovered an issue when using the javascriptCaptureStart/javascriptCaptureEnd function pair on captured content exceeding a certain limit (about 50KB in my case). The helper then returns only a semicolon.

I have the following code in a view script (the PHP tags were stripped by the archive; reconstructed from context):

    <?php $this->dojo()->javascriptCaptureStart(); ?>
    var data = <?php echo $this->data; ?>;
    ...
    <?php $this->dojo()->javascriptCaptureEnd(); ?>

And, as the number of rows in my database table is growing, $this->data is getting bigger and bigger. Finally, over ~50KB, the PHP tag pair is replaced by a semicolon and nothing else (not even the "var data =" preceding that tag). I was able to trace this issue back to the addJavascript($js) method in Zend_Dojo_View_Helper_Dojo_Container, and more precisely to the preg_replace call it makes:

    $js = preg_replace('/^\s*(.*?)\s*$/s', '$1', $js);

If you look closely at the regexp, you'll see a question mark following the "." inside the parentheses. On my server:

    $a = str_repeat('a', 49997);
    $a = preg_replace('/^\s*(.*?)\s*$/s', '$1', $a);

returns the string, but:

    $a = str_repeat('a', 49998);
    $a = preg_replace('/^\s*(.*?)\s*$/s', '$1', $a);

returns NULL. If I remove the question mark, preg_replace operates properly, whatever the size of the string.
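The ~50KB threshold is consistent with PHP 5.2's default pcre.backtrack_limit of 100000: the lazy (.*?) quantifier backtracks roughly twice per character before the anchored \s*$ can succeed. A minimal sketch that reproduces the failure deterministically by lowering the limit (the tiny limit value is for demonstration only, not a realistic setting):

```php
<?php
// Force a tiny backtrack limit so the trimming regex from
// addJavascript() fails even on a short string; with PHP 5.2's
// default limit of 100000, the same failure appeared near 50KB.
ini_set('pcre.backtrack_limit', '100');

$js      = str_repeat('a', 1000);
$trimmed = preg_replace('/^\s*(.*?)\s*$/s', '$1', $js);

// preg_replace() returns NULL on a PCRE error; the specific error
// can be inspected with preg_last_error().
var_dump($trimmed === null);
var_dump(preg_last_error() === PREG_BACKTRACK_LIMIT_ERROR);
```

Checking preg_last_error() is what distinguishes "the pattern matched nothing" from "PCRE gave up", which is why the bug was so hard to spot in the first place.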

Eric Coleman said it could be solved by a configuration directive:

Could it be those two configurable limits causing the problem? We had some problems here a while back with hitting the backtrack limit, which caused weird behaviour; perhaps you're running into the same issue.

Matthew Weier O'Phinney suggested the following solution:

    if (preg_match('/^(\s+)/', $a, $matches)) {
        $a = substr($a, strlen($matches[1]));
    }
    if (preg_match('/(\s+)$/', $a, $matches)) {
        $a = substr($a, 0, strlen($a) - strlen($matches[1]));
    }
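The two anchored preg_match calls above strip leading and trailing whitespace without a lazy quantifier, so they never have to backtrack across the whole string. A self-contained sketch of the same idea (the function name trimCapturedJs is hypothetical, for illustration only):

```php
<?php
// Hypothetical standalone version of the suggested fix: trim the
// captured script with anchored patterns that match only at the
// edges of the string, regardless of its size.
function trimCapturedJs($a)
{
    if (preg_match('/^(\s+)/', $a, $matches)) {
        $a = substr($a, strlen($matches[1]));
    }
    if (preg_match('/(\s+)$/', $a, $matches)) {
        $a = substr($a, 0, strlen($a) - strlen($matches[1]));
    }
    return $a;
}

// Behaves like the original regex, but works on inputs of any size.
$big = "\n  " . str_repeat('a', 200000) . "  \n";
var_dump(trimCapturedJs($big) === str_repeat('a', 200000)); // bool(true)
```

Note that PHP's built-in trim() achieves the same result with no regex at all, which would have been an even simpler fix.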


Posted by Ralph Schindler (ralph) on 2011-02-18T09:44:09.000+0000

Please reopen if I am wrong, but if you are hitting limits set by PHP, then you should raise those limits or find an alternative route to express large amounts of data. If your dataset grows, perhaps it should be loaded through a separate request (ajax), in chunks.
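For completeness, the two configurable PCRE limits referred to earlier in the thread are set in php.ini (or at runtime via ini_set()). A sketch with illustrative values only, not recommendations:

```ini
; php.ini -- illustrative values only
pcre.backtrack_limit = 1000000
pcre.recursion_limit = 1000000
```

Raising them trades the NULL-return failure for more CPU and stack usage during matching, which is why avoiding the backtracking pattern altogether is the more robust fix.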


