Issues

ZF-6383: limit reached in preg_replace when using javascriptCaptureStart/javascriptCaptureEnd on a long script

Description

I discovered an issue when using the javascriptCaptureStart/javascriptCaptureEnd function pair on a captured script exceeding a certain size (about 50KB in my case). The helper returns only a semicolon.

I have the following code in a view script:

<?php $this->dojo()->javascriptCaptureStart(); ?>
var data = <?php echo $this->data; ?>;
...
<?php $this->dojo()->javascriptCaptureEnd(); ?>

And, as the number of rows in my database table grows, $this->data gets bigger and bigger. Finally, over ~50KB, the output of the tag pair is replaced by a semicolon and nothing else (not even the "var data =" preceding the tag). I was able to trace this issue back to the addJavascript($js) method in Zend_Dojo_View_Helper_Dojo_Container, and more precisely to its preg_replace() call:

If you look closely at the regexp, you'll see a question mark following the ".*" inside the parentheses, making the subpattern lazy. On my server,

$a = str_repeat('a', 49997);
$a = preg_replace('/^\s*(.*?)\s*$/s', '$1', $a);

returns the string, but

$a = str_repeat('a', 49998);
$a = preg_replace('/^\s*(.*?)\s*$/s', '$1', $a);

returns NULL. If I remove the question mark (making the capture greedy), preg_replace operates properly regardless of the string's size.
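The report can be confirmed in isolation: preg_replace() returns NULL when the PCRE engine aborts rather than raising an error, and preg_last_error() identifies the cause. A minimal sketch, assuming the default pcre.backtrack_limit of 100000:

// preg_replace() signals failure by returning NULL; preg_last_error()
// then reports why the engine gave up.
$a = str_repeat('a', 49998);
$result = preg_replace('/^\s*(.*?)\s*$/s', '$1', $a);

if ($result === null) {
    // The lazy quantifier forces roughly two engine steps per character,
    // so ~50K characters exhausts the default backtrack limit of 100000
    var_dump(preg_last_error() === PREG_BACKTRACK_LIMIT_ERROR); // bool(true)
}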

Eric Coleman said it could be solved by a configuration directive: http://us.php.net/manual/en/pcre.configuration.php

Could it be those two configurable limits causing the problems? We had some problems here a while back with hitting the backtrack limit, which was causing weird behaviour; perhaps you're running into the same issues.
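Both directives linked above are PHP_INI_ALL, so they can be raised at runtime with ini_set() (or in php.ini) before the preg_replace() call. A hedged workaround sketch; the values below are illustrative, not recommendations:

// Raise the PCRE limits above the size of the captured script
// (100000 was the default in the PHP 5.x releases current at the time)
ini_set('pcre.backtrack_limit', '1000000');
ini_set('pcre.recursion_limit', '1000000');

$a = str_repeat('a', 49998);
// With the higher limit, the same pattern now succeeds on the same input
var_dump(preg_replace('/^\s*(.*?)\s*$/s', '$1', $a) === $a); // bool(true)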

Matthew Weier O'Phinney suggested the following solution:

// Strip the leading whitespace run without a lazy quantifier: match it at
// the start of the string, then cut it off with substr()
if (preg_match('/^(\s+)/', $a, $matches)) {
    $a = substr($a, strlen($matches[1]));
}
// Likewise strip the trailing whitespace run
if (preg_match('/(\s+)$/', $a, $matches)) {
    $a = substr($a, 0, strlen($a) - strlen($matches[1]));
}
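Since the regexp's only purpose is trimming surrounding whitespace, a non-PCRE alternative would be PHP's built-in trim() with an explicit character mask. A sketch, noting that trim()'s default mask includes NUL but omits form feed, so an explicit mask comes closer to PCRE's \s:

// Trim roughly the same whitespace class as \s without touching the regex
// engine at all (PCRE versions differ on whether \s includes vertical tab)
$a = trim($a, " \t\n\r\f\x0B");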

Comments

Please reopen if I am wrong, but if you are hitting limits set by PHP, then you should raise those limits or find an alternative route to deliver large amounts of data. If your dataset keeps growing, perhaps it should be loaded through a separate request (AJAX), in chunks, as sketched below.
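A hedged sketch of that alternative (the controller, action, and table class names here are illustrative, not from the issue): serve the data as JSON from a dedicated action and fetch slices of it client-side, e.g. with dojo.xhrGet, instead of inlining the whole dataset in the view.

class DataController extends Zend_Controller_Action
{
    public function chunkAction()
    {
        // Paging parameters supplied by the client-side request
        $offset = (int) $this->_getParam('offset', 0);
        $limit  = (int) $this->_getParam('limit', 500);

        // Fetch only one slice of the table per request
        $table = new Application_Model_DbTable_Data(); // hypothetical table class
        $rows  = $table->fetchAll($table->select()->limit($limit, $offset));

        // The JSON action helper disables the layout, sets the proper
        // Content-Type header, and encodes the payload
        $this->_helper->json($rows->toArray());
    }
}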