Commits on Dec 29, 2022
-
* 6.2: [Cache] Fix possibly null value passed to preg_match() in RedisTrait
-
* 6.1: [Cache] Fix possibly null value passed to preg_match() in RedisTrait
-
* 6.0: [Cache] Fix possibly null value passed to preg_match() in RedisTrait
-
* 5.4: [Cache] Fix possibly null value passed to preg_match() in RedisTrait
-
bug #48823 [Cache] Fix possibly null value passed to preg_match() in RedisTrait (chalasr)

This PR was merged into the 5.4 branch.

Discussion
----------

[Cache] Fix possibly null value passed to preg_match() in RedisTrait

| Q             | A
| ------------- | ---
| Branch?       | 5.4
| Bug fix?      | yes
| New feature?  | no
| Deprecations? | no
| Tickets       | -
| License       | MIT
| Doc PR        | -

Commits
-------

89e6c2f [Cache] Fix possibly null value passed to preg_match() in RedisTrait
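The commit itself is not shown here; as an illustration only, the following is a minimal sketch of the general pattern such a fix implies: coalescing a possibly null value to a string before handing it to `preg_match()`, which deprecates null subjects as of PHP 8.1. The `$dsn['host']` key below is hypothetical and not taken from RedisTrait.

```php
<?php
// Hypothetical illustration, not the actual RedisTrait patch:
// a parsed DSN may omit the "host" key, leaving a null value.
$dsn = ['port' => 6379]; // no "host" entry

// Passing null to preg_match() is deprecated since PHP 8.1,
// so coalesce to an empty string before matching.
$host = $dsn['host'] ?? '';

if (preg_match('/^[^:]+$/', $host)) {
    echo "host looks valid: {$host}\n";
} else {
    echo "no usable host in DSN\n";
}
```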
-
feature #47709 [HttpFoundation] Add `StreamedJsonResponse` for efficient JSON streaming (alexander-schranz)

This PR was squashed before being merged into the 6.3 branch.

Discussion
----------

[HttpFoundation] Add `StreamedJsonResponse` for efficient JSON streaming

| Q             | A
| ------------- | ---
| Branch?       | 6.2
| Bug fix?      | no
| New feature?  | yes
| Deprecations? | no
| Tickets       | Fix #...
| License       | MIT
| Doc PR        | symfony/symfony-docs#17301

When big data sets are streamed via a JSON API it can be difficult to keep resource usage low. To address this I experimented with a different way of streaming data for JSON responses. It uses a combination of a structured array and generators, which gave much better results. More can be read about it here: [https://github.com/alexander-schranz/efficient-json-streaming-with-symfony-doctrine](https://github.com/alexander-schranz/efficient-json-streaming-with-symfony-doctrine). I think it could be a great addition to Symfony itself, to make this kind of response easier to build and APIs more performant.

## Usage

<details><summary>First version (replaced)</summary>

```php
class ArticleListAction
{
    public function __invoke(EntityManagerInterface $entityManager): Response
    {
        $articles = $this->findArticles($entityManager);

        return new StreamedJsonResponse(
            // json structure with replacer identifiers
            [
                '_embedded' => [
                    'articles' => '__articles__',
                ],
            ],
            // array of generators, with the replacer identifier used as key
            [
                '__articles__' => $articles,
            ]
        );
    }

    private function findArticles(EntityManagerInterface $entityManager): \Generator
    {
        $queryBuilder = $entityManager->createQueryBuilder();
        $queryBuilder->from(Article::class, 'article');
        $queryBuilder->select('article.id')
            ->addSelect('article.title')
            ->addSelect('article.description');

        return $queryBuilder->getQuery()->toIterable();
    }
}
```

</details>

Updated version (thanks to `@ro0NL` for the idea):

```php
class ArticleListAction
{
    public function __invoke(EntityManagerInterface $entityManager): Response
    {
        $articles = $this->findArticles($entityManager);

        return new StreamedJsonResponse(
            // json structure with generators in it, which are streamed
            [
                '_embedded' => [
                    'articles' => $articles, // a generator which is streamed
                ],
            ],
        );
    }

    private function findArticles(EntityManagerInterface $entityManager): \Generator
    {
        $queryBuilder = $entityManager->createQueryBuilder();
        $queryBuilder->from(Article::class, 'article');
        $queryBuilder->select('article.id')
            ->addSelect('article.title')
            ->addSelect('article.description');

        return $queryBuilder->getQuery()->toIterable();
    }
}
```

----

As proposed by `@OskarStark`, here is the full content of the blog post ["Efficient JSON Streaming with Symfony and Doctrine"](https://github.com/alexander-schranz/efficient-json-streaming-with-symfony-doctrine/edit/main/README.md):

# Efficient JSON Streaming with Symfony and Doctrine

After reading a tweet about how we provide only a few items (max. 100) over our JSON APIs while serving 4k images for our websites, I wondered why that is the case. To see the main difference we first need to know how images are streamed. Web servers today mostly use the sendfile feature, which is very efficient because it streams a file chunk by chunk and never needs to load the whole data.
So I asked myself how we can achieve the same mechanism for our JSON APIs, with a little experiment.

As an example we will look at a basic entity with the following fields defined:

- id: int
- title: string
- description: text

The response of our API should look like the following:

```json
{
    "_embedded": {
        "articles": [
            {
                "id": 1,
                "title": "Article 1",
                "description": "Description 1\nMore description text ..."
            },
            ...
        ]
    }
}
```

Normally, to provide this API we would do something like this:

```php
<?php

namespace App\Controller;

use App\Entity\Article;
use Doctrine\ORM\EntityManagerInterface;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Response;

class ArticleListAction
{
    public function __invoke(EntityManagerInterface $entityManager): Response
    {
        $articles = $this->findArticles($entityManager);

        return JsonResponse::fromJsonString(json_encode([
            'embedded' => [
                'articles' => $articles,
            ],
            'total' => 100_000,
        ], JSON_THROW_ON_ERROR | JSON_UNESCAPED_SLASHES | JSON_UNESCAPED_UNICODE));
    }

    // normally this method would live in a repository
    private function findArticles(EntityManagerInterface $entityManager): iterable
    {
        $queryBuilder = $entityManager->createQueryBuilder();
        $queryBuilder->from(Article::class, 'article');
        $queryBuilder->select('article.id')
            ->addSelect('article.title')
            ->addSelect('article.description');

        return $queryBuilder->getQuery()->getResult();
    }
}
```

In most cases we will add some pagination to the endpoint so our responses do not get too big.

## Making the API more efficient

But there is also a way to stream this response efficiently. First of all we need to adjust how we load the articles. This can be done by replacing `getResult` with the more efficient [`toIterable`](https://www.doctrine-project.org/projects/doctrine-orm/en/2.9/reference/batch-processing.html#iterating-results):

```diff
- return $queryBuilder->getQuery()->getResult();
+ return $queryBuilder->getQuery()->toIterable();
```

Still, the whole JSON needs to be in memory before it can be sent, so we also have to refactor how we create our response. We will replace our `JsonResponse` with the [`StreamedResponse`](https://symfony.com/doc/6.0/components/http_foundation.html#streaming-a-response) object:

```php
return new StreamedResponse(function () use ($articles) {
    // stream json
}, 200, ['Content-Type' => 'application/json']);
```

But the `json` format is not ideal for streaming, so we need a few tricks to make it streamable. First we define the basic structure of our JSON like this:

```php
$jsonStructure = json_encode([
    'embedded' => [
        'articles' => ['__REPLACES_ARTICLES__'],
    ],
    'total' => 100_000,
], JSON_THROW_ON_ERROR | JSON_UNESCAPED_SLASHES | JSON_UNESCAPED_UNICODE);
```

Instead of the `$articles` we use a placeholder, which we then use to split the string into a `$before` and an `$after` variable:

```php
[$before, $after] = explode('"__REPLACES_ARTICLES__"', $jsonStructure, 2);
```

Now we first send the `$before` part:

```php
echo $before . PHP_EOL;
```

Then we stream the articles one by one; here we need to keep the separating comma in mind, which is emitted before every article except the first:

```php
foreach ($articles as $count => $article) {
    if ($count !== 0) {
        echo ',' . PHP_EOL; // if not the first element we need a separator
    }

    echo json_encode($article, JSON_THROW_ON_ERROR | JSON_UNESCAPED_SLASHES | JSON_UNESCAPED_UNICODE);
}
```

We also add an additional `flush` after every 500 elements:

```php
if ($count % 500 === 0 && $count !== 100_000) { // flush response after every 500
    flush();
}
```

After that we also send the `$after` part:

```php
echo PHP_EOL;
echo $after;
```

## The result

So in the end the whole action looks like the following:

```php
<?php

namespace App\Controller;

use App\Entity\Article;
use Doctrine\ORM\EntityManagerInterface;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpFoundation\StreamedResponse;

class ArticleListAction
{
    public function __invoke(EntityManagerInterface $entityManager): Response
    {
        $articles = $this->findArticles($entityManager);

        return new StreamedResponse(function () use ($articles) {
            // define our json structure but replace the articles with a placeholder
            $jsonStructure = json_encode([
                'embedded' => [
                    'articles' => ['__REPLACES_ARTICLES__'],
                ],
                'total' => 100_000,
            ], JSON_THROW_ON_ERROR | JSON_UNESCAPED_SLASHES | JSON_UNESCAPED_UNICODE);

            // split by placeholder
            [$before, $after] = explode('"__REPLACES_ARTICLES__"', $jsonStructure, 2);

            // send the first part of the json
            echo $before . PHP_EOL;

            // stream the articles one by one as their own json
            foreach ($articles as $count => $article) {
                if ($count !== 0) {
                    echo ',' . PHP_EOL; // if not the first element we need a separator
                }

                if ($count % 500 === 0 && $count !== 100_000) { // flush response after every 500
                    flush();
                }

                echo json_encode($article, JSON_THROW_ON_ERROR | JSON_UNESCAPED_SLASHES | JSON_UNESCAPED_UNICODE);
            }

            // send the last part of the json
            echo PHP_EOL;
            echo $after;
        }, 200, ['Content-Type' => 'application/json']);
    }

    private function findArticles(EntityManagerInterface $entityManager): iterable
    {
        $queryBuilder = $entityManager->createQueryBuilder();
        $queryBuilder->from(Article::class, 'article');
        $queryBuilder->select('article.id')
            ->addSelect('article.title')
            ->addSelect('article.description');

        return $queryBuilder->getQuery()->toIterable();
    }
}
```

The metrics for 100,000 articles (nginx + php-fpm 7.4, MacBook Pro 2013):

|                           | Old Implementation | New Implementation |
|---------------------------|--------------------|--------------------|
| Memory Usage              | 49.53 MB           | 2.10 MB            |
| Memory Usage Peak         | 59.21 MB           | 2.10 MB            |
| Time to first Byte        | 478 ms             | 28 ms              |
| Time                      | 2.335 s            | 0.584 s            |

This way we not only reduced the memory usage on our server, we also made the response faster. Memory usage was measured with `memory_get_usage` and `memory_get_peak_usage`; "Time to first Byte" is the value reported by the browser, and response times were measured with curl.

**Updated 2022-10-02 (symfony serve + php-fpm 8.1, MacBook Pro 2021)**

|                           | Old Implementation | New Implementation |
|---------------------------|--------------------|--------------------|
| Memory Usage              | 64.21 MB           | 2.10 MB            |
| Memory Usage Peak         | 73.89 MB           | 2.10 MB            |
| Time to first Byte        | 0.203 s            | 0.049 s            |
| Updated Time (2022-10-02) | 0.233 s            | 0.232 s            |

While there is not much difference in the time of a single response, the real gain is the lower memory usage, which kicks in when you have many simultaneous requests. On my machine that is more than 150 simultaneous requests, which is a high value and will be a lot lower on a normal server. While the old implementation crashes at 150 simultaneous requests, the new implementation still works with 220 simultaneous requests, which means roughly 46% more requests are possible.

## Reading Data in JavaScript

As we stream the data, our JavaScript on the other end should consume it the same way, i.e. the data needs to be read as a stream. Here I'm just following the Fetch API example [Processing a text file line by line](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API/Using_Fetch#processing_a_text_file_line_by_line). If we look at our [`script.js`](public/script.js), we split the response line by line and append each object to our table. This is definitely not how JSON should normally be read and parsed; it is only meant as an example of how the response could be consumed from a stream.

## Conclusion

The implementation looks a little hacky; for maintainability it could be moved into its own factory which creates this kind of response. Example:

```php
return StreamedResponseFactory::create(
    [
        'embedded' => [
            'articles' => ['__REPLACES_ARTICLES__'],
        ],
        'total' => 100_000,
    ],
    ['__REPLACES_ARTICLES__' => $articles]
);
```

The JavaScript part is definitely not ready for production, and if you use it you should probably create your own content type, e.g. `application/json+stream`, so that you only parse the JSON this way when you know it really is in this line-by-line format. There may be better libraries like [`JSONStream`](https://www.npmjs.com/package/JSONStream) to read the data, but I have not tested them yet. Let me know if somebody has experience with them and has solutions for this.

At the very least, I think everybody providing lists should use [`toIterable`](https://www.doctrine-project.org/projects/doctrine-orm/en/2.9/reference/batch-processing.html#iterating-results) when possible when loading data via Doctrine, and select specific fields instead of full entities to avoid the ORM hydration process.

Let me know what you think about this experiment and how you currently provide your JSON data. The whole experiment can be checked out and tested yourself via [this repository](https://github.com/alexander-schranz/efficient-json-streaming-with-symfony-doctrine). Join the discussion on [Twitter](https://twitter.com/alex_s_/status/1488314080381313025).

## Update 2022-09-27

Added a [StreamedJsonResponse](src/Controller/StreamedJsonResponse.php) class and tried to contribute this implementation to the Symfony core (#47709).

## Update 2022-10-02

Updated some statistics with the new machine and Apache Benchmark tests for concurrent requests.

Commits
-------

ecc5355 [HttpFoundation] Add `StreamedJsonResponse` for efficient JSON streaming
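For readers who just want the merged feature rather than the hand-rolled `StreamedResponse` approach above, here is a minimal, framework-free sketch based only on the constructor usage shown in the PR examples, with a plain generator standing in for a Doctrine query; treat it as illustrative rather than a definitive reference for the final API.

```php
<?php
// Minimal sketch based on the examples above: any generator placed in the
// structure is streamed item by item instead of being buffered in memory.
use Symfony\Component\HttpFoundation\StreamedJsonResponse;

function articles(): \Generator
{
    // imagine this yielding rows from a database cursor instead
    for ($i = 1; $i <= 3; ++$i) {
        yield ['id' => $i, 'title' => 'Article '.$i];
    }
}

$response = new StreamedJsonResponse([
    '_embedded' => [
        'articles' => articles(), // streamed, not loaded into memory at once
    ],
]);
```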
-
feature #48810 Drop v1 contracts packages everywhere (derrabus)
This PR was merged into the 6.3 branch.

Discussion
----------

Drop v1 contracts packages everywhere

| Q             | A
| ------------- | ---
| Branch?       | 6.3
| Bug fix?      | no
| New feature?  | no
| Deprecations? | no
| Tickets       | N/A
| License       | MIT
| Doc PR        | N/A

Version 1 of the Symfony contracts was released during the Symfony 4.x days, but many Symfony 6 components still declare compatibility with it:

https://github.com/symfony/symfony/blob/a8bbf632df06b26bbd8a62f22203054bbff34d32/src/Symfony/Component/DependencyInjection/composer.json#L22

This circumstance complicates our build matrix and makes issues like #48792 hard to track down. Version 2 was released together with Symfony 5, which means we really only need v1 contracts if we want to actively support Symfony 4 components, which are EOL and already unsupported by Symfony 6. I make the bold assumption that no Symfony 6 app should be required to pin any of the contracts packages to v1 or even an outdated v2 minor release. This is why I propose to drop support for contracts v1 everywhere this has not happened already, making `^2.5 || ^3` the loosest constraint we use for a contracts package on a Symfony 6.3 component or bundle.

Commits
-------

8671ad5 Drop v1 contracts packages everywhere
-
* 6.2:
  fix for caching without auth parameter, broken by #48711, fix for #48813
  Bump Symfony version to 6.2.4
  Update VERSION for 6.2.3
  Update CHANGELOG for 6.2.3
  Bump Symfony version to 6.1.10
  Update VERSION for 6.1.9
  Update CHANGELOG for 6.1.9
  Bump Symfony version to 6.0.18
  Update VERSION for 6.0.17
  Update CHANGELOG for 6.0.17
  Bump Symfony version to 5.4.18
  Update VERSION for 5.4.17
  Update CONTRIBUTORS for 5.4.17
  Update CHANGELOG for 5.4.17
  [DependencyInjection] Fix resolving parameters when dumping lazy proxies
-
* 6.1:
  fix for caching without auth parameter, broken by #48711, fix for #48813
  Bump Symfony version to 6.1.10
  Update VERSION for 6.1.9
  Update CHANGELOG for 6.1.9
  Bump Symfony version to 6.0.18
  Update VERSION for 6.0.17
  Update CHANGELOG for 6.0.17
  Bump Symfony version to 5.4.18
  Update VERSION for 5.4.17
  Update CONTRIBUTORS for 5.4.17
  Update CHANGELOG for 5.4.17
-
bug #48816 [Cache] Fix for RedisAdapter without auth parameter (rikvdh)
This PR was merged into the 5.4 branch.

Discussion
----------

[Cache] Fix for RedisAdapter without auth parameter

| Q             | A
| ------------- | ---
| Branch?       | 5.4
| Bug fix?      | yes
| New feature?  | no
| Deprecations? | no
| Tickets       | Fix #48813
| License       | MIT
| Doc PR        | x

Compatibility with Redis without auth was broken by #48711; this change fixes it. This applies to all versions (6.x as well).

Commits
-------

0cf91ab fix for caching without auth parameter, broken by #48711, fix for #48813
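For context, the regression affected connections whose DSN carries no credentials. Below is a minimal sketch of such a setup, assuming a local Redis reachable without a password; the DSN and namespace are example values, not taken from the PR.

```php
<?php
// Example only: a Redis DSN without any auth component, the case
// reported in #48813. Requires symfony/cache and a local Redis server.
use Symfony\Component\Cache\Adapter\RedisAdapter;

// No user/password in the DSN
$client = RedisAdapter::createConnection('redis://localhost:6379');

$cache = new RedisAdapter($client, 'app.cache');

$item = $cache->getItem('greeting');
if (!$item->isHit()) {
    $item->set('hello');
    $cache->save($item);
}
```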
Commits on Dec 28, 2022
-
bug #48805 [DependencyInjection] Fix resolving parameters when dumping lazy proxies (nicolas-grekas)

This PR was merged into the 6.2 branch.

Discussion
----------

[DependencyInjection] Fix resolving parameters when dumping lazy proxies

| Q             | A
| ------------- | ---
| Branch?       | 6.2
| Bug fix?      | yes
| New feature?  | no
| Deprecations? | no
| Tickets       | -
| License       | MIT
| Doc PR        | -

Resolving parameters in lazy services' "proxy" tags never worked, and the code I added to 6.2 is broken. Let's drop it.

Commits
-------

123d453 [DependencyInjection] Fix resolving parameters when dumping lazy proxies