Sustainable Reporting - Emissions
Tracking carbon emissions in end-to-end tests.
The tooling used in this report has been superseded by the npm package @danhartley/emissions [24 July 2024].
Measuring carbon emissions associated with websites and apps is in its infancy but there are benefits to doing so.
The first is an appreciation of how emissions relate to bytes transferred.
The second is seeing how emissions fluctuate in response to changes in design and code. This can be done by recording emissions during end-to-end (E2E) tests.
Emissions per byte
In order to measure carbon emissions for this website, I'm using CO2.js from The Green Web Foundation (GWF). The simplest call to their API needs only a byte value.
If the site is hosted on servers running on renewable energy, the emissions will be lower. The GWF provides a helper function for checking whether a site is green hosted.
Here is an example for a website of median page weight.
```javascript
import { hosting, co2 } from "@tgwf/co2"

// hosting.check resolves to true when the domain is green hosted
const green = await hosting.check("median-website.com")
const bytes = 2299 * 1000 // median page weight of ~2,299 kB
const emissions = new co2().perByte(bytes, green)
```
| Green web host | Carbon dioxide emissions |
|---|---|
| Yes | ~783 mg CO2 |
| No | ~905 mg CO2 |
End-to-end testing
To record requests and calculate emissions, I created a simple helper class, which I have since converted into an npm library: @danhartley/emissions.
A call to the instance's single public method returns a summary of page metrics relevant to its emissions, including:
- The number of requests
- Whether the site uses green hosting
- The grid intensity (either calculated from the request or set manually)
- The total transfer size of requests in kilobytes (kB)
- How long the page took to load in milliseconds (ms)
- Emissions in milligrams of carbon dioxide
These values can be persisted and used to monitor the effect of code or design changes.
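A minimal sketch of what such a summary might look like; the field names and the `summarise` helper are illustrative, not the actual API of @danhartley/emissions:

```javascript
// Illustrative sketch of aggregating recorded requests into a page
// summary. Names are hypothetical, not the package's real API.
const summarise = ({ requests, greenHosting, gridIntensity, loadTimeMs, mgCO2 }) => ({
  requestCount: requests.length,
  greenHosting,                 // from the GWF green hosting check
  gridIntensity,                // gCO2/kWh, measured or set manually
  totalKB: requests.reduce((kB, r) => kB + r.bytes, 0) / 1000,
  loadTimeMs,
  mgCO2,
})

const summary = summarise({
  requests: [{ bytes: 150000 }, { bytes: 80000 }],
  greenHosting: true,
  gridIntensity: 494,
  loadTimeMs: 938,
  mgCO2: 137,
})
// summary.requestCount → 2, summary.totalKB → 230
```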
I also created a test that can report on any website (it simply loads the given page). By default, the test scrolls to the bottom of the page, which may give a more honest account of what is happening.
```shell
node emissions-tracker/emissions-by-url.js -u https://www.theguardian.com/uk -v -lh

# -u: website URL or domain
# -v: verbose output
# -lh: also run a Lighthouse report
```
The test environment is Chrome under the control of Puppeteer.
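The approach can be sketched as follows. This is a simplified illustration of the technique, not the emissions tracker's actual code; `trackPage` and `bytesFromHeaders` are hypothetical names:

```javascript
// Sketch: drive Chrome with Puppeteer, sum the transfer size of every
// response, and scroll to the bottom so lazy-loaded resources count.
const bytesFromHeaders = (headers) => Number(headers["content-length"] ?? 0)

const trackPage = async (url) => {
  const { default: puppeteer } = await import("puppeteer")
  const browser = await puppeteer.launch()
  const page = await browser.newPage()

  let requests = 0
  let bytes = 0
  page.on("response", (response) => {
    requests += 1
    bytes += bytesFromHeaders(response.headers())
  })

  const start = Date.now()
  await page.goto(url, { waitUntil: "networkidle2" })

  // Scroll smoothly to the end of the page, as the report above does
  await page.evaluate(() =>
    window.scrollTo({ top: document.body.scrollHeight, behavior: "smooth" })
  )

  const loadTimeMs = Date.now() - start
  await browser.close()

  return { requests, kB: Math.round(bytes / 1000), loadTimeMs }
}
```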
Report
Here are the results from a few sites including this one.
- The GWF: The Green Web Foundation
- The PG: The Public Good
- The Guardian: The Guardian newspaper
- iNaturalist: iNaturalist
| Website | Requests | kB | ms | mg CO2 |
|---|---|---|---|---|
| The GWF | 24 | 533 | 938 | 137 (45) |
| The PG | 86 | 343 | 983 | 99 (35) |
| The Guardian | 110 | 2291 | 1533 | 897 (828) |
| iNaturalist | 59 | 2798 | 3906 | 771 (664) |
Here is a comparison with the results of some popular online carbon calculators. All the values from the Emissions Tracker (ET) are based on an automated scroll to the bottom of the page; the values without scrolling are given in brackets. This distinction is important, and one that is often ignored or left unexplored.
All values in mg CO2:

| Website | ET | EG | CN | WC |
|---|---|---|---|---|
| The GWF | 137 (45) | 50 | 14 | 40 |
| The PG | 99 (35) | 90 | 9 | 60 |
| The Guardian | 897 (828) | 1069 | 296 | 1060 |
| iNaturalist | 771 (664) | 570 | 259 | 630 |
We can compare the bytes transferred value of the Emissions Tracker (ET) with Chrome DevTools (DT) and Lighthouse (LH).
N.B. Figures for Lighthouse are low for requests because only the page above the fold is analysed. When I run the Emissions Tracker, I scroll smoothly to the end of the page.
| Website | ET (kB) | DT (kB) | LH (kB) |
|---|---|---|---|
| The GWF | 533 | 508 | 141 |
| The PG | 343 | 224 | 169 |
| The Guardian | 2291 | 2200 | 2259 |
| iNaturalist | 2798 | 2400 | 2925 |
The number of requests.
| Website | ET | DT | LH |
|---|---|---|---|
| The GWF | 24 | 21 | 23 |
| The PG | 86 | 56 | 49 |
| The Guardian | 110 | 114 | 123 |
| iNaturalist | 59 | 44 | 57 |
And load time.
| Website | ET (ms) | DT (ms) | LH (ms) |
|---|---|---|---|
| The GWF | 938 | 1002 | 983 |
| The PG | 983 | 877 | 1345 |
| The Guardian | 1533 | 1240 | 1480 |
| iNaturalist | 3906 | 6000 | 3931 |
There are discrepancies in the results, and between runs using the same measure. One of the largest, an order of magnitude, is between the results given by the Carbon Neutral website and those of the other calculators. The only calculator that gives similar results is GreenFrame, but it requires local installation or a subscription. Another site, ecoIndex, gives values an order of magnitude higher than the average; for The Public Good, for example, it calculates emissions of 1,620 mg CO2.
Until there is consensus and certainty around these numbers, I think we should be careful about publishing them. It is too often the case that a single figure gets picked up and repeated endlessly, as happened with the carbon emissions attributed to watching Netflix.
Living information architecture
The greatest benefit to me from measuring bytes and emissions was that I paid more attention to code running where others see and interact with it - in the browser.
The Information Architecture (IA) of a website, and how individuals navigate it, affects bytes transferred, processing time and resources.
For example, this blog preloads linked pages. The home page has a lot of internal links which significantly increase its page weight (a mix of the number of bytes transferred and the number of requests).
But if you click on a visible link (above the fold) to internal content, you will see that page loads almost instantly with very few bytes being transferred.
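A sketch of how such a strategy can be implemented. This is an assumed implementation for illustration, not necessarily how this blog does it:

```javascript
// Sketch: prefetch internal links as they enter the viewport, so a
// later click transfers almost no extra bytes.
const isInternal = (href) => href.startsWith("/")

const prefetch = (href) => {
  const link = document.createElement("link")
  link.rel = "prefetch"
  link.href = href
  document.head.appendChild(link)
}

const initPrefetch = () => {
  const observer = new IntersectionObserver((entries) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue
      prefetch(entry.target.getAttribute("href"))
      observer.unobserve(entry.target) // prefetch each link only once
    }
  })
  document.querySelectorAll("a[href]").forEach((a) => {
    if (isInternal(a.getAttribute("href"))) observer.observe(a)
  })
}

// Only meaningful in a browser context
if (typeof document !== "undefined") initPrefetch()
```

The trade-off is visible in the tables above: prefetching increases the weight of the page that contains the links, in exchange for near-instant subsequent navigation.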
I also cache pages. If you return to a page, it is served from a local cache and the only network traffic will be to third parties (such as Cabin analytics).
The effectiveness of this strategy depends on how people use the site; whether they move between pages, or indeed whether they read more than one article in a single session. It's quite possible they won't read more than one but they may jump from page to page.
To observe this behaviour, open the Network tab in the developer tools of any browser.
Summary
Recording performance and sustainability metrics during end-to-end tests in (or similar to) the production environment is a way for developers to get closer to the experience of people using their site. It is more meaningful to engage with the living information architecture in the browser than dead artefacts.
Emissions tracking and digital sustainability in general cannot be viewed, however, in isolation, but should be considered alongside accessibility, performance, ethical factors and security.
And whilst some comparison with similar sites is useful, most is gained by observing change within a site. The best feature is sometimes the one that doesn't get built.
A snapshot of a page, or even of an entire site, is insufficient to judge its merit. iNaturalist, for example, the platform for recording observations of nature, scores poorly in some respects, but the site has changed little in fifteen years, a testament to good information architecture and clear aims.
Conclusion
Knowing how and when bytes are transferred and emissions accrued helps developers make good decisions about site architecture, especially as a site is modified and extended and code is refactored.
I was less convinced that displaying emissions would be useful for site visitors, until I checked what happened when I scrolled through the posts on my Facebook home page. Since posts are added faster than I can scroll, the feed is effectively infinite. The initial page load was about 9 MB. After one minute of scrolling, I had downloaded:
174 MB
I would like to see native emissions counters in browsers that aggregate emissions across sites. Whether this would be over a session or time interval would be up to us, as would the option to set a budget or cap on emissions. It would certainly help highlight the deleterious effect of devious and deceptive patterns like infinite scroll and video autoplay.
Finally, emissions reflect only a fraction of a website's impact on its environment. A full Digital Life Cycle Assessment (DLCA) would be needed to take into account water and land use and adverse effects on people and nature, to name only a few considerations.
Appendix
Content length
When the content length is unavailable, I use the response byte length. However, this is the uncompressed value, and compression ratios vary.
To compensate, I set default compression ratios for CSS (6), JS (2) and other resources (5). These values can be overridden on the command line.
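A sketch of this fallback, with `estimateTransferBytes` as a hypothetical helper name:

```javascript
// Sketch: when Content-Length is missing, approximate the transfer
// size by dividing the uncompressed body size by a default
// compression ratio per resource type (CSS 6, JS 2, other 5).
const DEFAULT_RATIOS = { css: 6, js: 2, other: 5 }

const estimateTransferBytes = (uncompressedBytes, type, ratios = DEFAULT_RATIOS) =>
  Math.round(uncompressedBytes / (ratios[type] ?? ratios.other))

estimateTransferBytes(60000, "css") // → 10000
estimateTransferBytes(60000, "js")  // → 30000
```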
Performance API
I initially used values from the Performance API (a more recent alternative is the PerformanceObserver API) but this returns a value of 0 bytes for requests to third parties.
When CORS is in effect, many of the timing properties' values are returned as zero unless the server's access policy permits these values to be shared. This requires the server providing the resource to send the Timing-Allow-Origin HTTP response header with a value specifying the origin or origins which are allowed to get the restricted timestamp values.
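A short sketch of how such zeroed entries can be spotted from the Resource Timing API:

```javascript
// Sketch: list resources whose size data was zeroed, typically
// cross-origin responses lacking a Timing-Allow-Origin header.
const opaqueEntries = performance
  .getEntriesByType("resource")
  .filter((entry) => entry.transferSize === 0 && entry.duration > 0)

console.log(`${opaqueEntries.length} resources report zero transfer size`)

// The fix is server-side; the third-party server must respond with e.g.
//   Timing-Allow-Origin: *
```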
Links to external references
- Sustainability | HTTP Archive
- Digital Carbon Footprint: The Current State of Measuring Tools | marmelab
- Estimating Digital Emissions | Sustainable Web Design
- CO2.js - Overview | The Green Web Foundation
- Web Performance Recipes With Puppeteer | Addy Osmani
- Digital Life Cycle Assessments | Mightybytes
- Why We Don’t Report Website Carbon Emissions | DebugBear
- Measure and reduce your website's CO2 emissions | GreenFrame
- Why web perf tools should be reporting website carbon emissions | Fershad Irani