Pragmatic Web Performance

A presentation at GitLab Virtual Contribute in April 2020 by Denys Mishunov

Slide 1

PRAGUEMATIC WEB PERFORMANCE DENYS MISHUNOV Senior Frontend Engineer, Create::Editor

Slide 2

IT’S ABOUT FRONT-END

Slide 3

How long is forever? Sometimes, just one second

Slide 4

PRAGUEMATIC WEB PERFORMANCE

Slide 5

WEB PERFORMANCE 2020

Slide 6

“OFTEN POOR RESPONSE TIME ON WEBSITE”

Slide 7

“PERFORMANCE… OTHERWISE A GREAT TOOL”

Slide 8

“IT’S JUST SLOW”

Slide 9

“OFTEN POOR RESPONSE TIME ON WEBSITE” • “PERFORMANCE… OTHERWISE A GREAT TOOL” • “IT’S JUST SLOW” (GitLab, UX research by Jeff Crow and Farnoosh Seifoddini)

Slide 10

18.5% dedicated the time to say how not impressed they are with performance (GitLab, UX research by Jeff Crow and Farnoosh Seifoddini)

Slide 11

Q2 OKRs https://gitlab.com/groups/gitlab-com/-/epics/464

Slide 12

WEB PERFORMANCE 2020

Slide 13

PRAGMATIC prag • mat • ic adj. Dealing or concerned with facts or actual occurrences; practical

Slide 14

“IT’S OK, BUT SPEED MAY BE IMPROVED”

Slide 15

MONITOR • OPTIMISE

Slide 16

MEASURE • MONITOR • OPTIMISE

Slide 17

PRAGMATIC MEASURE • MONITOR • OPTIMISE

Slide 18

1. MEASURE

Slide 19

WHAT TO MEASURE? • Overall page size • Number of server requests • The size of the bundled JS resources • etc.

Slide 20

WHAT TO MEASURE? PAGE LOADING TIME

Slide 21

WHAT TO MEASURE? PAGE LOADING TIME

Slide 22

WHAT TO MEASURE? PAGE LOADING TIME • DOMContentLoaded Event • Onload Event • First Paint • First Contentful Paint • First Meaningful Paint • Largest Contentful Paint • SpeedIndex

Slide 23

Performance score: 52 (Fast 3G connection, caching disabled) First Contentful Paint 2.9 s • Speed Index 3.2 s • Time to Interactive 10.1 s • First Meaningful Paint 7.7 s • First CPU Idle 10.0 s • Max Potential First Input Delay 1,160 ms

Slide 24

First Contentful Paint 2.9 s • Speed Index 3.2 s • Time to Interactive 10.1 s • First Meaningful Paint 7.7 s • First CPU Idle 10.0 s • Max Potential First Input Delay 1,160 ms. The problem with these metrics: Generic • Could be fragile for monitoring • There are too many

Slide 25

First Contentful Paint 2.9 s • Speed Index 3.2 s • Time to Interactive 10.1 s • First Meaningful Paint 7.7 s • First CPU Idle 10.0 s • Max Potential First Input Delay 1,160 ms HOW SOON DOES THE USER SEE THE SNIPPET?

Slide 26

WHAT TO MEASURE? PAGE LOADING TIME • DOMContentLoaded Event • Onload Event • First Paint • First Contentful Paint • First Meaningful Paint • Largest Contentful Paint • SpeedIndex

Slide 27

WHAT TO MEASURE? PAGE LOADING TIME USER TIMING API

Slide 28

PERFORMANCE API https://developer.mozilla.org/en-US/docs/Web/API/Performance

Slide 29

PERFORMANCE API https://developer.mozilla.org/en-US/docs/Web/API/Performance Performance Timeline API

Slide 30

PERFORMANCE API https://developer.mozilla.org/en-US/docs/Web/API/Performance Performance Timeline API • Navigation Timing API

Slide 31

PERFORMANCE API https://developer.mozilla.org/en-US/docs/Web/API/Performance Performance Timeline API • Navigation Timing API • Resource Timing API (https://mzl.la/2Kg4oIT)

Slide 32

PERFORMANCE API https://developer.mozilla.org/en-US/docs/Web/API/Performance Performance Timeline API • Navigation Timing API • Resource Timing API • User Timing API

Slide 33

WHAT TO MEASURE? PAGE LOADING TIME USER TIMING API

Slide 34

SHOW ME HOW PRACTICAL EXAMPLE

Slide 35

Slide 36

Slide 37

Slide 38

performance.measure(MEASURE_NAME, START_MARK, END_MARK);

Slide 39

window.requestAnimationFrame(() => { performance.measure(MEASURE_NAME, START_MARK, END_MARK); })

Slide 40

Slide 41

Slide 42

IS VUE THE PROBLEM?

Slide 43

Slide 44

Slide 45

Slide 46

75% waiting to bootstrap the Vue application* (*most probably due to the development env)

Slide 47

USER TIMING API

Slide 48

USER TIMING API performance.mark(START_MARK);

Slide 49

USER TIMING API performance.mark(START_MARK); performance.mark(END_MARK);

Slide 50

USER TIMING API performance.mark(START_MARK); performance.mark(END_MARK); performance.measure(MEASURE_NAME, START_MARK, END_MARK);

Slide 51

USER TIMING API performance.mark(START_MARK); performance.measure(MEASURE_NAME, START_MARK);

Slide 52

USER TIMING API performance.measure(MEASURE_NAME);

Slide 53

USER TIMING API CROSS-COMPONENT Marks can be placed and measured anywhere on the page NOT VUE-SPECIFIC Can be added to any JS

Slide 54

DEVTOOLS ARE NOT SCALABLE Common sense

Slide 55

MEASURE (Page Loading Time, User Timing API) • MONITOR • OPTIMISE

Slide 56

  2. MONITOR

Slide 57

MONITOR

Slide 58

MONITOR ANALYTICS PERFORMANCE

Slide 59

MONITOR ANALYTICS TOOLS Google Analytics Snowplow etc.

Slide 60

MONITOR PERFORMANCE TOOLS Webpagetest Pingdom sitespeed.io SpeedCurve etc.

Slide 61

MONITOR ANALYTICS PERFORMANCE SNOWPLOW SITESPEED.IO

Slide 62

MONITOR SNOWPLOW https://docs.gitlab.com/ee/telemetry/index.html • Clicking links or buttons. • Submitting forms. • Other typically interface-driven actions

Slide 63

MONITOR SNOWPLOW https://docs.gitlab.com/ee/telemetry/index.html • Event tracking: https://docs.gitlab.com/ee/telemetry/index.html • Snowplow tracking guide: https://docs.gitlab.com/ee/telemetry/snowplow.html#frontendtracking • Enable Snowplow tracking in Admin Area > Settings > Integrations • Add tracking to your code

Slide 64

MONITOR PERFORMANCE TOOLS

Slide 65

Slide 66

Slide 67

MONITOR SNOWPLOW (ex-Periscope)

Slide 68

MONITOR SNOWPLOW PROS: • Monitors exactly what you send to it CONS: • Monitors exactly what you send to it • Metrics won’t work with the DNT setting • Requires analytics-specific code

Slide 69

MONITOR ANALYTICS PERFORMANCE SNOWPLOW SITESPEED.IO

Slide 70

MONITOR SITESPEED https://gitlab.com/gitlab-org/frontend/sitespeed-measurement-setup

Slide 71

MONITOR SITESPEED Grafana

Slide 72

MONITOR SITESPEED PROS: • Dedicated performance tool • No additional code required CONS: • Setting up Docker to run against localhost or a local host (defined in /etc/hosts) is a nightmare

Slide 73

MONITOR ANALYTICS PERFORMANCE RETROSPECTIVE

Slide 74

MONITOR PROACTIVE Practical part

Slide 75

context 'Frontend Performance' do
  let(:file_name) { 'popen.rb' }
  let(:content) { project.repository.blob_at('master', 'files/ruby/popen.rb').data }

  before do
    stub_feature_flags(snippets_vue: true)
    visit snippet_path(snippet)
    # wait_for_requests
    sleep 5
  end

  it 'starts rendering snippet within 0.5 seconds +-20%' do
    expect(page.evaluate_script('window.performance.getEntriesByName("vue-start")[0].startTime / 1000')).to be_within(0.125).of(0.5)
  end

  it 'renders full snippet within 2 seconds +-20%' do
    expect(page.evaluate_script('window.performance.getEntriesByName("content-full")[0].duration / 1000')).to be_within(0.5).of(2.0)
  end
end

Slide 76

MEASURE (Page Loading Time, User Timing API) • MONITOR (Sitespeed, Snowplow) • OPTIMISE

Slide 77

  3. OPTIMISE

Slide 78

IS IT NEEDED?

Slide 79

IS IT NEEDED? WATCH CLOSELY & GET READY

Slide 80

OPTION #1 OPTION #2 1.6 SECONDS 2.0 SECONDS

Slide 81

  1. PERFORMANCE BUDGET INSTANT (0.1–0.2s) IMMEDIATE (0.5–1s) USER FLOW (2–15s)
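Those budget bands can be applied mechanically to any measured duration. A hypothetical helper; the thresholds are the ones from the slide:

```javascript
// Classify a measured duration into the performance-budget bands above.
function budgetBand(seconds) {
  if (seconds <= 0.2) return 'instant';   // 0.1–0.2s
  if (seconds <= 1) return 'immediate';   // 0.5–1s
  if (seconds <= 15) return 'user flow';  // 2–15s
  return 'over budget';
}

console.log(budgetBand(0.5)); // 'immediate'
console.log(budgetBand(3.2)); // 'user flow'
```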

Slide 82

  2. THE NEED: OPTIMISE TO IMPROVE BAD METRICS

Slide 83

OPTION #1 OPTION #2 1.6 SECONDS 2.0 SECONDS

Slide 84

OPTION #1 OPTION #2 400 milliseconds 1.6 SECONDS 2.0 SECONDS

Slide 85

WEBER-FECHNER LAW JUST NOTICEABLE DIFFERENCE (JND) EVENT

Slide 86

WEBER-FECHNER LAW JUST NOTICEABLE DIFFERENCE (JND) EVENT 20%
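In other words, an optimisation has to shave at least roughly 20% off the current timing before users can perceive it at all. A quick back-of-the-envelope check, assuming the 20% figure:

```javascript
// Weber-Fechner: differences below the just noticeable difference (JND)
// threshold (~20% here) are imperceptible to users.
const JND = 0.2;

function noticeableTarget(currentSeconds) {
  // The slowest timing a user could still notice as "faster".
  return currentSeconds * (1 - JND);
}

console.log(noticeableTarget(2.0)); // 1.6, i.e. a 400ms improvement
```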

Slide 87

JUST NOTICEABLE DIFFERENCE

Slide 88

JUST NOTICEABLE DIFFERENCE: NOT NECESSARILY MEANINGFUL

Slide 89

  3. CHASING THE LEADER: OPTIMISE TO SURVIVE COMPETITION

Slide 90

  3. CHASING THE LEADER YOU • YOU WISH

Slide 91

  3. CHASING THE LEADER GEOMETRIC BISECTION

Slide 92

  3. CHASING THE LEADER GEOMETRIC BISECTION

Slide 93

  3. CHASING THE LEADER GEOMETRIC BISECTION

Slide 94

  3. CHASING THE LEADER GEOMETRIC BISECTION 3.2S
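One way to read geometric bisection: rather than targeting the leader's timing directly, set each intermediate goal at the geometric mean of your current timing and theirs. The numbers below are hypothetical, chosen only to land on the slide's 3.2s:

```javascript
// Geometric bisection: the next target is the geometric mean of where
// you are and where the leader is. Hypothetical numbers.
function nextTarget(yourSeconds, leaderSeconds) {
  return Math.sqrt(yourSeconds * leaderSeconds);
}

console.log(nextTarget(6.4, 1.6)); // ~3.2s: the intermediate goal
```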

Slide 95

IS IT NEEDED?

  1. PERFORMANCE BUDGET 2. THE NEED 3. CHASING THE LEADER

Slide 96

#thanks #GitLabContribute For time spent educating me during the coffee-chats, valuable comments and help: RAMYA AUTHAPPAN · ALEX BUIJS · JEFF CROW · JEROME NG · JEREMY JACKSON · GEORGI N. GEORGIEV · PEDRO POMBEIRO · VIJAY HAWOLDAR · ASH MCKENZIE