This repository was archived by the owner on Aug 31, 2023. It is now read-only.

bench: Add JSON support to xtask bench #3845

Merged: 3 commits merged into main from perf/json-benchmarks on Nov 28, 2022

Conversation

@MichaReiser (Contributor)

Summary

Add new JSON benchmarks to xtask bench.

Doing so required refactoring xtask bench to use generic parse, analyze, and format functions.
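
As a rough sketch of what the generic functions look like in spirit (illustrative names only; this is not the PR's actual code, which calls into the Rome parser crates), the idea is to dispatch on the file extension and return one shared result type so that a single timing loop covers both JavaScript and JSON:

// Minimal standalone sketch, not the PR's code: dispatch on file extension
// and return one enum so the benchmark's timing loop stays language-agnostic.
#[derive(Debug)]
enum ParseResult {
    JavaScript { node_count: usize },
    Json { node_count: usize },
}

fn parse(code: &str, extension: &str) -> ParseResult {
    match extension {
        // Stand-ins for the real parsers; they only approximate the work done.
        "js" | "ts" => ParseResult::JavaScript {
            node_count: code.split_whitespace().count(),
        },
        "json" => ParseResult::Json {
            node_count: code.chars().filter(|c| *c == ':' || *c == ',').count(),
        },
        other => panic!("unsupported extension: {other}"),
    }
}

fn main() {
    for (name, code, ext) in [
        ("router.ts", "const x: number = 1;", "ts"),
        ("db.json", r#"{"a": 1, "b": [2, 3]}"#, "json"),
    ] {
        let start = std::time::Instant::now();
        let result = parse(code, ext);
        println!("{name}: {result:?} parsed in {:?}", start.elapsed());
    }
}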

Test Plan

I ran xtask bench for the parser, formatter, and linter, with criterion both enabled and disabled.

@MichaReiser added the A-Tooling label (Area: our own build, development, and release tooling) on Nov 24, 2022
let parse_duration = parser_timer.stop();

#[cfg(feature = "dhat-heap")]
println!("Parsed");
#[cfg(feature = "dhat-heap")]
let stats = print_diff(stats, dhat::HeapStats::get());

let tree_sink_timer = timing::start();
@MichaReiser (Contributor, Author) commented on this hunk:


I removed the timing on the different phases because it would otherwise require opening up some of the parser internals.
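
For context, this is the shape of the simpler measurement that remains (a minimal sketch with assumed names, not the xtask code itself): time the parse as a whole rather than hooking into the lexer/parser/tree-sink phases, which would require exposing parser internals.

// Illustrative only: a generic timing helper that wraps an entire phase,
// so no internal parser stages need to be exposed.
use std::time::{Duration, Instant};

fn timed<T>(f: impl FnOnce() -> T) -> (T, Duration) {
    let start = Instant::now();
    let value = f();
    (value, start.elapsed())
}

fn main() {
    // Hypothetical usage: measure one whole parse in a single timing span.
    let (token_count, parse_duration) = timed(|| "const x = 1;".split_whitespace().count());
    println!("counted {token_count} tokens in {parse_duration:?}");
}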

@MichaReiser marked this pull request as ready for review November 24, 2022 10:42
@MichaReiser requested a review from xunilrj as a code owner November 24, 2022 10:42
@MichaReiser requested a review from a team November 24, 2022 10:42
@MichaReiser requested review from leops, ematipico and a team as code owners November 25, 2022 11:02
@MichaReiser (Contributor, Author)

!bench_parser

@github-actions (bot)

Parser Benchmark Results

group                                 main                                   pr
-----                                 ----                                   --
parser/big5-added.json                                                       1.00    194.4±0.12µs    86.9 MB/sec
parser/canada.json                                                           1.00    102.4±3.00ms    21.0 MB/sec
parser/checker.ts                     1.02    124.0±2.15ms    21.0 MB/sec    1.00    121.3±1.90ms    21.4 MB/sec
parser/compiler.js                    1.00     69.9±1.55ms    15.0 MB/sec    1.00     69.7±1.42ms    15.0 MB/sec
parser/d3.min.js                      1.01     42.3±0.58ms     6.2 MB/sec    1.00     41.8±0.56ms     6.3 MB/sec
parser/db.json                                                               1.00      4.9±0.03ms    36.4 MB/sec
parser/dojo.js                        1.00      3.7±0.01ms    18.7 MB/sec    1.00      3.7±0.01ms    18.7 MB/sec
parser/eucjp.json                                                            1.00    311.1±0.27µs   125.9 MB/sec
parser/ios.d.ts                       1.01    108.2±1.41ms    17.2 MB/sec    1.00    107.0±2.01ms    17.4 MB/sec
parser/jquery.min.js                  1.00     11.3±0.04ms     7.3 MB/sec    1.00     11.3±0.04ms     7.3 MB/sec
parser/math.js                        1.00     85.0±1.57ms     7.6 MB/sec    1.01     85.6±5.79ms     7.6 MB/sec
parser/package-lock.json                                                     1.00      2.0±0.01ms    68.1 MB/sec
parser/parser.ts                      1.01      2.6±0.00ms    18.6 MB/sec    1.00      2.6±0.01ms    18.7 MB/sec
parser/pixi.min.js                    1.00     52.7±1.12ms     8.3 MB/sec    1.00     52.6±1.19ms     8.4 MB/sec
parser/react-dom.production.min.js    1.00     15.2±0.07ms     7.6 MB/sec    1.00     15.2±0.06ms     7.6 MB/sec
parser/react.production.min.js        1.00    791.8±1.36µs     7.8 MB/sec    1.00    792.6±1.39µs     7.8 MB/sec
parser/router.ts                      1.00   1289.7±2.54µs    26.4 MB/sec    1.01   1297.4±2.93µs    26.3 MB/sec
parser/tex-chtml-full.js              1.00    116.3±1.19ms     7.8 MB/sec    1.01    117.3±1.98ms     7.8 MB/sec
parser/three.min.js                   1.00     57.4±1.05ms    10.2 MB/sec    1.01     57.9±1.19ms    10.1 MB/sec
parser/typescript.js                  1.02    493.5±3.43ms    19.3 MB/sec    1.00    486.0±3.31ms    19.5 MB/sec
parser/vue.global.prod.js             1.00     18.5±0.08ms     6.5 MB/sec    1.00     18.6±0.08ms     6.5 MB/sec

Base automatically changed from feat/json-parser to main November 25, 2022 13:51
@netlify (bot) commented Nov 25, 2022

Deploy Preview for docs-rometools canceled.

Latest commit: 9a5c13e
Latest deploy log: https://app.netlify.com/sites/docs-rometools/deploys/6380ff80a5af5400094b101b

@calibre-analytics (bot) commented Nov 25, 2022

Comparing Snapshot #4 of "bench: Add JSON support to xtask bench" to the median since the last deploy of rome.tools.

Profile                               LCP                 CLS              TBT
Overall (median, all pages/profiles)  2.48s (from 257ms)  0.0 (no change)  188ms (no change)
Chrome Desktop • Cable                2.48s (from 257ms)  0.0 (no change)  344ms (from 20ms)
iPhone 12 • 4G LTE                    1.11s (from 241ms)  0.0 (no change)  12ms (no change)
Motorola Moto G Power • Regular 3G    16.5s (from 1.07s)  0.0 (no change)  188ms (no change)

1 page tested: Home

Browser previews: Chrome Desktop, iPhone 12 (4G LTE), Motorola Moto G Power (Regular 3G)

Most significant changes

Metric                          Profile                               Value    Previous
JS Parse & Compile              Motorola Moto G Power, 3G connection  1.77s    from 27ms
Total JavaScript Size in Bytes  Chrome Desktop                        5.36 MB  from 86.8 KB
Total JavaScript Size in Bytes  iPhone, 4G LTE                        5.36 MB  from 86.8 KB
Total JavaScript Size in Bytes  Motorola Moto G Power, 3G connection  5.36 MB  from 86.8 KB
JS Parse & Compile              iPhone, 4G LTE                        490ms    from 11ms

27 other significant changes: JS Parse & Compile on Chrome Desktop, Total Blocking Time on Chrome Desktop, Largest Contentful Paint on Motorola Moto G Power, 3G connection, First Contentful Paint on Motorola Moto G Power, 3G connection, Total CSS Size in Bytes on Chrome Desktop, Total CSS Size in Bytes on iPhone, 4G LTE, Total CSS Size in Bytes on Motorola Moto G Power, 3G connection, Time to Interactive on Motorola Moto G Power, 3G connection, Total Page Size in Bytes on Chrome Desktop, Total Page Size in Bytes on iPhone, 4G LTE, Total Page Size in Bytes on Motorola Moto G Power, 3G connection, Largest Contentful Paint on Chrome Desktop, First Contentful Paint on Chrome Desktop, Time to Interactive on Chrome Desktop, Number of Requests on Motorola Moto G Power, 3G connection, Number of Requests on Chrome Desktop, Number of Requests on iPhone, 4G LTE, Speed Index on Motorola Moto G Power, 3G connection, Time to Interactive on iPhone, 4G LTE, First Contentful Paint on iPhone, 4G LTE, Largest Contentful Paint on iPhone, 4G LTE, Speed Index on Chrome Desktop, Total HTML Size in Bytes on Chrome Desktop, Total HTML Size in Bytes on iPhone, 4G LTE, Total HTML Size in Bytes on Motorola Moto G Power, 3G connection, Lighthouse Performance Score on Motorola Moto G Power, 3G connection, Lighthouse Performance Score on Chrome Desktop


@github-actions (bot)

Parser conformance results on ubuntu-latest

js/262

Test result main count This PR count Difference
Total 45879 45879 0
Passed 44936 44936 0
Failed 943 943 0
Panics 0 0 0
Coverage 97.94% 97.94% 0.00%

jsx/babel

Test result main count This PR count Difference
Total 39 39 0
Passed 36 36 0
Failed 3 3 0
Panics 0 0 0
Coverage 92.31% 92.31% 0.00%

symbols/microsoft

Test result main count This PR count Difference
Total 5946 5946 0
Passed 1757 1757 0
Failed 4189 4189 0
Panics 0 0 0
Coverage 29.55% 29.55% 0.00%

ts/babel

Test result main count This PR count Difference
Total 588 588 0
Passed 519 519 0
Failed 69 69 0
Panics 0 0 0
Coverage 88.27% 88.27% 0.00%

ts/microsoft

Test result main count This PR count Difference
Total 16257 16257 0
Passed 12397 12397 0
Failed 3860 3860 0
Panics 0 0 0
Coverage 76.26% 76.26% 0.00%

@ematipico (Contributor)

I wonder if we should specialize the commands now. I don't think we always want to run all the parsers via the bench_parser command, especially in PRs that only change a single parser.

Maybe we should have commands like:

  • bench_parser_js
  • bench_parser_json

Same thing for the formatter

@MichaReiser (Contributor, Author) commented Nov 25, 2022

(Quoting the comment above about specializing the bench_parser command per language.)

You can pass a test suite and even filter by test name:

cargo bench_parser --suites js --suite json --filter <someName>

@ematipico (Contributor)

Sorry, I actually meant the magic comments we use here to run the benchmarks in GitHub Actions. I left out the ! that triggers them by mistake 🤣

@MichaReiser (Contributor, Author)

(Quoting the comment above about the GitHub Actions magic comments.)

That could be useful. The JSON tests don't add much, but this is something we can tackle separately.

use rome_js_syntax::{JsAnyRoot, JsSyntaxNode, SourceType};
use rome_parser::prelude::ParseDiagnostic;

pub enum Parse<'a> {
A reviewer (Contributor) commented:


Nit: these could probably be implemented as traits instead of enums, to avoid introducing an extra match in the benchmark loop (although its impact on the benchmark is probably insignificant)

@MichaReiser (Contributor, Author) replied:


Using traits makes the benchmark code somewhat awkward because it would need different code paths for each language to avoid dynamic dispatch (using &dyn). That's why I prefer the current solution, particularly because the match overhead should be negligible considering what we're measuring in the benchmarks.
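
A minimal sketch of the trade-off being discussed (illustrative types only, not the actual Rome syntax-tree types): the enum keeps a single benchmark loop with one cheap match per call, while the trait alternative would require either &dyn dispatch or separate monomorphized loops per language.

// Illustrative only; the real Parse<'a> enum wraps language-specific parse results.
enum Parsed {
    Js(Vec<String>),
    Json(Vec<String>),
}

impl Parsed {
    // One small match per call; negligible next to the parsing/formatting being measured.
    fn diagnostics(&self) -> &[String] {
        match self {
            Parsed::Js(d) | Parsed::Json(d) => d,
        }
    }
}

// The trait alternative: callers need either `&dyn AnyParsed` (dynamic dispatch)
// or a generic benchmark function monomorphized per language.
trait AnyParsed {
    fn diagnostics(&self) -> &[String];
}

impl AnyParsed for Parsed {
    fn diagnostics(&self) -> &[String] {
        match self {
            Parsed::Js(d) | Parsed::Json(d) => d,
        }
    }
}

fn main() {
    let parsed = Parsed::Json(vec![]);
    assert!(parsed.diagnostics().is_empty()); // enum + match
    let dynamic: &dyn AnyParsed = &parsed;
    assert!(dynamic.diagnostics().is_empty()); // trait-object alternative
}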

@MichaReiser (Contributor, Author)

!bench_formatter

@github-actions (bot)

Formatter Benchmark Results

group                                    main                                   pr
-----                                    ----                                   --
formatter/big5-added.json                                                       1.00    315.2±0.31µs    53.6 MB/sec
formatter/canada.json                                                           1.00     63.1±0.66ms    34.0 MB/sec
formatter/checker.ts                     1.02    417.3±3.26ms     6.2 MB/sec    1.00    409.5±3.03ms     6.3 MB/sec
formatter/compiler.js                    1.01    222.8±1.84ms     4.7 MB/sec    1.00    219.5±1.17ms     4.8 MB/sec
formatter/d3.min.js                      1.00    171.8±1.49ms  1562.7 KB/sec    1.01    173.2±1.86ms  1549.8 KB/sec
formatter/db.json                                                               1.00      5.4±0.01ms    33.3 MB/sec
formatter/dojo.js                        1.01     11.4±0.06ms     6.0 MB/sec    1.00     11.3±0.04ms     6.1 MB/sec
formatter/eucjp.json                                                            1.00    622.9±0.37µs    62.9 MB/sec
formatter/ios.d.ts                       1.02    247.4±2.33ms     7.5 MB/sec    1.00    243.0±3.73ms     7.7 MB/sec
formatter/jquery.min.js                  1.00     46.5±0.37ms  1819.9 KB/sec    1.00     46.6±0.32ms  1817.8 KB/sec
formatter/math.js                        1.02    344.0±2.01ms  1927.8 KB/sec    1.00    338.2±2.19ms  1960.7 KB/sec
formatter/package-lock.json                                                     1.00      2.6±0.01ms    53.7 MB/sec
formatter/parser.ts                      1.00      7.7±0.04ms     6.3 MB/sec    1.00      7.7±0.02ms     6.3 MB/sec
formatter/pixi.min.js                    1.01    186.1±2.15ms     2.4 MB/sec    1.00    184.3±1.52ms     2.4 MB/sec
formatter/react-dom.production.min.js    1.00     54.5±0.73ms     2.1 MB/sec    1.01     54.9±0.61ms     2.1 MB/sec
formatter/react.production.min.js        1.00      2.7±0.01ms     2.3 MB/sec    1.00      2.7±0.01ms     2.3 MB/sec
formatter/router.ts                      1.00      3.6±0.01ms     9.4 MB/sec    1.00      3.6±0.01ms     9.4 MB/sec
formatter/tex-chtml-full.js              1.01    436.3±2.66ms     2.1 MB/sec    1.00    432.2±1.65ms     2.1 MB/sec
formatter/three.min.js                   1.00    222.5±1.49ms     2.6 MB/sec    1.00    222.0±2.48ms     2.6 MB/sec
formatter/typescript.js                  1.02   1519.8±8.70ms     6.3 MB/sec    1.00   1485.8±5.74ms     6.4 MB/sec
formatter/vue.global.prod.js             1.00     71.8±1.00ms  1717.5 KB/sec    1.01     72.5±0.77ms  1702.9 KB/sec

@MichaReiser merged commit 4b7b81d into main on Nov 28, 2022
@MichaReiser deleted the perf/json-benchmarks branch on November 28, 2022 09:21