Amp up for AMP Conf 2017

I can’t believe it’s already been over a year since we started our quest for faster, friendlier web pages. Now that we’re out of the honeymoon phase, the AMP team is taking a hard look at where we are today, what’s to come, and the direction of the AMP ecosystem.

It’s easy to blaze ahead with the development of new features, but it’s infinitely harder to create a healthy, breathing ecosystem. To do so, we want to continue to involve all of you – the AMP community – in figuring out the right path together. What better way to kick things off than to meet up?

With that, it’s my great pleasure to invite you to our first-ever AMP Conf. First, the basics:

  • March 7th and 8th
  • At the Tribeca 360 space in New York
  • Live-streamed around the world
  • Two full days of talks and panels
  • Targeted at web developers & designers

Whether you’re interested in or already building AMP pages, building a platform to display AMP content, or want to contribute to AMP itself (yes please!), we want you to participate. Request a seat to attend in person, or join via the live stream on YouTube.

Not only will the AMP team talk about new, exciting features and components – more than half of all talks and panels will be from you, members of the AMP ecosystem. We’ll discuss:

  • The challenges and wins of running AMP in production
  • How to create better, beautiful and interactive AMP pages
  • How your AMP pages are distributed across platforms
  • How to monetize AMP pages and the innovation happening around ads in AMP
  • How you can contribute to AMP

We’ll follow up with a more detailed conference schedule by the end of January, and if you have any questions not covered in the FAQ, reach out to me (or amp-conf-2017@google.com) anytime.

See you soon!

Posted by Paul Bakaus, AMP Developer Advocate, Google

Why AMP Caches exist

The following was posted on Medium by Paul Bakaus, AMP Developer Advocate, Google.

Caches are a fundamental piece of the Accelerated Mobile Pages (AMP) Project, yet one of its most misunderstood components. Every day, developers ask us why they can’t just get their AMP pages onto some AMP surfaces (e.g. Google) without linking through the cache. Some worry about the cache model breaking the origin model of the web, others worry about analytics attribution and canonical link sharing, and still others worry about their pages being hosted on servers outside of their control. Let’s look at all of these concerns and understand why the caches exist.

While AMP Caches introduce some trade-offs, they do work in the user’s favor to ensure a consistently fast and user-friendly experience. The caches are designed to:

  • Ensure that all AMP pages are actually valid AMP.
  • Allow AMP pages to be preloaded efficiently and safely.
  • Apply a myriad of additional, user-beneficial performance optimizations to content.

But first:

The Basics: Analytics attribution and link sharing

Even though the AMP Cache model doesn’t follow the origin model (serving your page from your own domain), we attribute all traffic to you, the publisher. Through the <amp-analytics> tag, AMP supports a growing list of analytics providers (26 to date!), so you can measure your success and make sure traffic is correctly attributed.
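
For example, a page can wire up one of these providers with a few lines of markup. Below is a minimal sketch using the Google Analytics configuration for <amp-analytics>; the “UA-XXXXX-Y” account ID is a placeholder:

    <!-- In <head>: load the amp-analytics extension. -->
    <script async custom-element="amp-analytics"
            src="https://cdn.ampproject.org/v0/amp-analytics-0.1.js"></script>

    <!-- In <body>: report a pageview once the page becomes visible.
         The account ID is a placeholder. -->
    <amp-analytics type="googleanalytics">
      <script type="application/json">
      {
        "vars": { "account": "UA-XXXXX-Y" },
        "triggers": {
          "trackPageview": {
            "on": "visible",
            "request": "pageview"
          }
        }
      }
      </script>
    </amp-analytics>

Because the configuration travels with the document, the same measurement works whether the page is served from your origin or from an AMP Cache.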

When I ask users and developers why they want to “click through” to the canonical page from a cached AMP result, the answer is often about link sharing. And granted, it’s annoying to copy a google.com URL instead of the canonical URL. However, the issue isn’t as big a problem as you’d think: Google amends its cached AMP search results with Schema.org and OpenGraph metadata, so posting the link to any platform that honors these should result in the canonical URL being shared. That said, there are more opportunities to improve the sharing flow. In native web views, an app could share the canonical URL directly if it supports doing so, and, based on user feedback, the team at Google is working on enabling easy access to the canonical URL on all of its surfaces.
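
To make the mechanics concrete, the canonical relationship and sharing metadata live in the AMP document itself. Here’s a minimal, illustrative sketch – the example.com URLs and the title are placeholders:

    <head>
      <!-- Required by AMP: points back to the canonical article. -->
      <link rel="canonical" href="https://example.com/article.html">

      <!-- Open Graph metadata, honored by many sharing surfaces. -->
      <meta property="og:url" content="https://example.com/article.html">
      <meta property="og:title" content="Article title">

      <!-- Schema.org metadata as JSON-LD. -->
      <script type="application/ld+json">
      {
        "@context": "http://schema.org",
        "@type": "NewsArticle",
        "mainEntityOfPage": "https://example.com/article.html",
        "headline": "Article title"
      }
      </script>
    </head>

Platforms and caches that honor this metadata can resolve and surface the canonical URL even when the page itself is served from a cache domain.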

With these cleared up, let’s dig a little deeper.

When the label says AMP, you get AMP

The AMP Project is an ecosystem that depends on strict validation, ensuring that very high performance and quality bars are met. A version of the validator can be used during development, but the AMP Cache ensures validity at the last stage, when the content is presented to the user.

When an AMP page is requested from an AMP Cache for the first time, the cache validates the document first and won’t offer it to the user if it detects problems. Platforms integrating with AMP (e.g. Bing, Pinterest, Google) can choose to send traffic directly to the AMP page on the origin or optionally to an AMP Cache, but validity can only be guaranteed when the page is served from the cache. This ensures that when users see the AMP label, they’ll almost always get a fast and user-friendly experience. (Unless you find a way to make a slow-but-valid AMP page, which is hard, but not impossible… I’m looking at you, big web fonts.)
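
As a rough illustration of what the validator is checking for, here is the skeleton of an AMP document (the mandatory amp-boilerplate CSS is abbreviated, so this sketch would not pass validation verbatim). During development, appending #development=1 to the page URL runs the validator right in the browser console:

    <!doctype html>
    <html ⚡>
      <head>
        <meta charset="utf-8">
        <title>Hello AMP</title>
        <link rel="canonical" href="https://example.com/hello.html">
        <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
        <!-- The mandatory amp-boilerplate style goes here (abbreviated). -->
        <style amp-boilerplate>/* … */</style>
        <script async src="https://cdn.ampproject.org/v0.js"></script>
      </head>
      <body>
        <h1>Hello, AMP world</h1>
      </body>
    </html>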

Pre-rendering is a bigger deal than you think

If you take anything away from this post, it’s that pre-rendering, especially the variant used in AMP, greatly outweighs any speed gains you could theoretically get by serving directly from an origin server. Even if the origin server happens to be closer to your users (rare, but possible) and shaves off a few milliseconds, pre-rendering will almost certainly have the bigger impact.

Perceived as much faster

In fact, pre-rendering can often save you seconds, not milliseconds. The impact of pre-rendering, as opposed to the various other performance optimizations in the AMP JS library, can be pretty dramatic, and contributes largely to the “instant-feel” experience.

Very efficient compared to full pre-rendering

If that were the whole story, we could just as easily pre-render AMP pages from their origin servers. But then we couldn’t guarantee that a page is valid AMP on the origin, and valid AMP is critically important for the custom pre-rendering the AMP JS library provides: pre-rendering in AMP, as opposed to pre-rendering an entire page through something like link prefetching, also limits the use of the user’s bandwidth, CPU and battery!

Valid AMP documents behave “cooperatively” during the pre-render stage: Only assets in the first viewport get preloaded, and no third-party scripts get executed. This results in a much cheaper, less bandwidth and CPU-intensive preload, allowing platforms to prerender not just the first, but a few of the AMP pages a user will likely click on.

Safe to embed

Because AMP Caches can’t rely on browser pre-rendering (see the section above), normal navigation from page to page doesn’t work. So in the AMP caching model, a page needs to be opened inline on a platform page. AMP Caches ensure that the requested AMP page can do that safely:

  • The validator ensures there is no Cross-Site Scripting (XSS) in the main document.
  • On top of the validator, the AMP Cache parses and then re-serializes the document in an unambiguous fashion (this means that it does not rely on HTML5 error correction). This ensures that browser parsing bugs and differences cannot lead to XSS.
  • The cache applies a Content Security Policy (CSP). This provides additional defense-in-depth against XSS attacks (a generic illustration follows this list).
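
To illustrate that last point in general terms (this is a generic example of the mechanism, not the AMP Cache’s actual policy), a Content Security Policy can restrict where scripts are allowed to load from:

    <!-- Generic illustration of a CSP delivered via a meta tag.
         The AMP Cache applies its own policy to the documents it serves. -->
    <meta http-equiv="Content-Security-Policy"
          content="script-src https://cdn.ampproject.org; object-src 'none'; base-uri 'none'">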

Additional privacy

In addition, the AMP Caches remove one important potential privacy issue from the pre-render: When you do a search on a content platform preloading AMP pages on the result page, none of the preloaded AMP pages will ever know about the fact that they’ve been preloaded.

Think about it this way: Say I search for “breakfast burrito”. If you know me well, you know I obviously searched for Parry Gripp’s song of the same name. But the search result page also shows me a couple of AMP search results from fast food chains that sell actual breakfast burritos. For the next month, I wouldn’t want to see actual breakfast burritos advertised everywhere even though I didn’t click on these links (even though…maybe I do…mhh..), and an advertiser wouldn’t want to waste dollars on pointless re-marketing ads showing me all the burritos. Since AMP hides the preload from the publisher of the AMP page and related third parties, it’s a win-win scenario for users and advertisers.

Auto-optimizations that often result in dramatic speed increases

The AMP Cache started out with all of the above, but has since added a number of transformative transformations (heh) to its feature roster. Among those optimizations:

  • Provides a consistent, fast and free content delivery network for all content (not just big publishers).
  • Optimizes HTML through measures such as bringing scripts into the ideal order, removing duplicate script tags and removing unnecessary quotes and whitespace.
  • Rewrites JavaScript URLs to have infinite cache time.
  • Optimizes images (a 40% average bandwidth improvement!)

On the image compression side alone, Google, through its cache, performs both lossless compression (no visual change at all, e.g. removing EXIF data) and lossy compression (no noticeable visual change). In addition, it converts images to WebP for browsers that support it and automatically generates srcset attributes (so-called responsive images) if they’re not already present, so that correctly sized images are generated and shown for each device.

Isn’t there a better way of doing this?

Look, I hear you. The provider of an AMP Cache is mirroring your content. It’s an important role and comes with great responsibility. If the cache provider were to do something truly stupid, like inserting obnoxious ads into every AMP page, AMP would stop being a viable solution for publishers, and thus wither away.

Remember, AMP has been created together with publishers, as a means to make the mobile web better for publishers, users and platforms. It’s why the AMP team has released strict guidelines for AMP Caches. To give you two interesting excerpts, the guidelines state that your content needs to provide “a faithful visual and UX reproduction of source document”, and cache providers must pledge that they will keep URLs working indefinitely, even after the cache itself may be decommissioned. These, and many more rules, ensure that a cache doesn’t mess with your content.

Most importantly, there’s plenty of room for more than one AMP Cache – in fact, Cloudflare just announced their own! With these AMP Cache guidelines released, other infrastructure companies are welcome to create new AMP Caches, as long as they follow the rules. It’s then up to the platform integrating AMP to pick their favorite cache.

From cache to web standards?

You’ve just read about all the wins and trade-offs that AMP Caches make to provide an instant-feeling, user-friendly mobile web experience. What if we could get many of the same awesome optimizations without the trade-offs, and without involving a cache at all?

Personally, I dream of future, still-to-be-invented web standards that would allow us to get there – to move beyond cache models (for example, a static layout system that tells the browser how a page will look before any assets are loaded).

In 2016, we took our first baby steps with the Content Performance Policy (CPP) proposal, which turned into Feature Policy: a way of saying things like “I disallow document.write on my site, and in any third-party iframes that get loaded”. More advanced concepts like static layout and safe pre-rendering require far-reaching changes to the web platform, but hey – just like forward time travel, it’s not impossible, just very, very difficult 🙂

Join me in figuring this out by getting in touch on Twitter or Slack, and know that I’ll always have an open ear for your questions, ideas and concerns. Onwards!

Posted by Paul Bakaus, AMP Developer Advocate, Google

Introducing Accelerated Mobile Links: Making the Mobile Web App-Quick

The following was posted on Cloudflare’s blog by Matthew Prince, CEO, Cloudflare.

We predict that in 2017 more than half of the traffic to Cloudflare’s network will come from mobile devices. Yet even when pages are formatted for small screens, the mobile web is built on traditional web protocols and technologies that were designed for desktop CPUs, network connections, and displays. As a result, browsing the mobile web feels sluggish compared with using native mobile apps.

In October 2015, the team at Google announced Accelerated Mobile Pages (AMP), a new, open technology to make the mobile web as fast as native apps. Since then, a large number of publishers have adopted AMP. Today, 600 million pages across 700,000 different domains are available in the AMP format.

The majority of traffic to this AMP content comes from people running searches on Google.com. If a visitor finds content through some source other than a Google search, even if the content can be served from AMP, it typically won’t be. As a result, the mobile web continues to be slower than it needs to be.

Making the Mobile Web App-Quick

Cloudflare’s Accelerated Mobile Links helps solve this problem, making content, regardless of how it’s discovered, app-quick. Once enabled, Accelerated Mobile Links automatically identifies links on a Cloudflare customer’s site to content with an AMP version available. If a link is clicked from a mobile device, the AMP content will be loaded nearly instantly.

Enabling Accelerated Mobile Links on Cloudflare

To see how it works, try viewing this post from your mobile device and clicking any of these links:

[Animated demo: Accelerated Mobile Links]

Increasing User Engagement

One of the benefits of Accelerated Mobile Links is that AMP content is loaded in a viewer directly on the site that linked to the content. As a result, when a reader is done consuming the AMP content, closing the viewer returns them to the original source of the link. In that way, every Cloudflare customer’s site can be more like a native mobile app, with the corresponding increase in user engagement.

For large publishers that want an even more branded experience, Cloudflare will offer the ability to customize the domain of the viewer to match the publisher’s domain. This, for the first time, provides a seamless experience where AMP content can be consumed without having to send visitors to a Google-owned domain. If you’re a large publisher interested in customizing the Accelerated Mobile Links viewer, you can contact Cloudflare’s team.

Innovating on AMP

While Google was the initial champion of AMP, the technologies involved are open. We worked closely with the Google team in developing Cloudflare’s Accelerated Mobile Links as well as our own AMP cache. Malte Ubl, the technical lead for the AMP Project at Google, said of our collaboration:

“Working with Cloudflare on its AMP caching solution was as seamless as open-source development can be. Cloudflare has become a regular contributor on the project and made the code base better for all users of AMP. It is always a big step for a software project to go from supporting specific caches to many, and it is awesome to see Cloudflare’s elegant solution for this.”

Cloudflare now powers the only compliant non-Google AMP cache with all the same performance and security benefits as Google.

In the spirit of open source, we’re working to help develop updates to the project to address some of publishers’ and end users’ concerns. Specifically, here are some features we’re developing to address concerns that have been expressed about AMP:

  • Easier ways to share AMP content using publishers’ original domains
  • Automatically redirecting desktop visitors from the AMP version back to the original version of the content
  • A way for end users who would prefer not to be redirected to the AMP version of content to opt out
  • The ability for publishers to brand the AMP viewer and serve it from their own domain

Cloudflare is committed to the AMP project. Accelerated Mobile Links is the first AMP feature we’re releasing, but we’ll be doing more over the months to come. As of today, Accelerated Mobile Links is available to all Cloudflare customers for free. You can enable it in your Cloudflare Performance dashboard. Stay tuned for more AMP features that will continue to increase the speed of the mobile web.

Posted by Matthew Prince, CEO, Cloudflare

Google AMP Cache, AMP Lite, and the need for speed

The following was posted on the Google Developers Blog by Huibao Lin and Eyal Peled, Software Engineers, Google.

At Google we believe in designing products with speed as a core principle. The Accelerated Mobile Pages (AMP) format helps ensure that content reliably loads fast, but we can do even better.

Smart caching is one of the key ingredients of the near-instant AMP experiences users get in products like Google Search and Google News & Weather. With caching, we can generally place content physically closer to the users who are requesting it, so that bytes take a shorter trip over the wire. In addition, using a single common infrastructure like a cache provides greater consistency in page serving times, even though the content originates from many hosts, which might have very different (and much larger) latency in serving the content than the cache does.

Faster and more consistent delivery are the major reasons why pages served in Google Search’s AMP experience come from the Google AMP Cache. The Cache’s unified content-serving infrastructure opens up the exciting possibility of building optimizations that scale to improve the experience across the hundreds of millions of documents served. Making these benefits available to any document is one of the main reasons the Google AMP Cache is free for anyone to use.

In this post, we’ll highlight two improvements we’ve recently introduced: (1) optimized image delivery and (2) enabling content to be served more successfully in bandwidth-constrained conditions through a project called “AMP Lite.”

Image optimizations by the Google AMP Cache

On average across the web, images make up 64% of the bytes of a page. This means images are a very promising target for impactful optimizations.

Applying image optimizations is an effective way to cut bytes on the wire. The Google AMP Cache employs the image optimization stack used by the PageSpeed Modules and Chrome Data Compression. (Note that in order to make these transformations, the Google AMP Cache disregards the “Cache-Control: no-transform” header.) Sites can get the same image optimizations on their origin by installing PageSpeed on their server.

Here’s a rundown of some of the optimizations we’ve made:

1) Removing data which is invisible or difficult to see

We remove image data that is invisible to users, such as thumbnail and geolocation metadata. For JPEG images, we also reduce quality and color samples if they are higher than necessary. To be exact, we reduce JPEG quality to 85 and color samples to 4:2:0 — i.e., one color sample per four pixels. Compressing a JPEG to quality higher than this or with more color samples takes more bytes, but the visual difference is difficult to notice.

The reduced image data is then exhaustively compressed. We’ve found that these optimizations reduce bytes by 40%+ while not being noticeable to the user’s eye.

2) Converting images to WebP format

Some image formats are more mobile-friendly. We convert JPEG to WebP for supported browsers. This transformation leads to an additional 25%+ reduction in bytes with no loss in quality.

3) Adding srcset

We add “srcset” if it has not been included. This applies to “amp-img” tags that have a “src” but no “srcset” attribute. The operation includes expanding the “amp-img” tag as well as resizing the image to multiple dimensions. This reduces the byte count further on devices with small screens.
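
Roughly, the transformation looks like the sketch below; the URLs and widths are illustrative placeholders rather than the cache’s exact output:

    <!-- Before: the publisher's tag, with only a src attribute. -->
    <amp-img src="https://example.com/hero.jpg"
             width="1200" height="800" layout="responsive"></amp-img>

    <!-- After: the cache resizes the image to several widths and adds a srcset,
         so devices with small screens fetch fewer bytes. -->
    <amp-img src="https://example.com/hero.jpg"
             srcset="https://example.com/hero-1200.jpg 1200w,
                     https://example.com/hero-800.jpg 800w,
                     https://example.com/hero-400.jpg 400w"
             width="1200" height="800" layout="responsive"></amp-img>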

4) Using lower quality images under some circumstances

We decrease the quality of JPEG images when there is an indication that this is desired by the user or for very slow network conditions (as part of AMP Lite discussed below). For example, we reduce JPEG image quality to 50 for Chrome users who have turned on Data Saver. This transformation leads to another 40%+ byte reduction to JPEG images.

The following example shows the images before (left/top) and after (right/bottom) optimizations. Originally the image has 241,260 bytes, and after applying Optimizations 1, 2, & 4 it becomes 25,760 bytes. After the optimizations the image looks essentially the same, but 89% of the bytes have been saved.

AMP Lite for Slow Network Conditions

Many people around the world access the internet with slow connection speeds or on devices with low RAM, and we’ve found that some AMP pages are not optimized for these severely bandwidth-constrained users. For this reason, Google has also launched AMP Lite to remove even more bytes from AMP pages for these users.

With AMP Lite, we apply all of the above optimizations to images. In particular, we always use lower quality levels (see Bullet 4 above).

In addition, we optimize external fonts by using the amp-font tag, setting the font loading timeout to 0 seconds so pages can be displayed immediately regardless of whether the external font was previously cached or not.
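
An amp-font tag with a zero timeout looks roughly like the sketch below; the font name and CSS class names are placeholders. With timeout="0", the custom font is used only if it is already available, and the fallback class kicks in immediately otherwise:

    <!-- In <head>: load the amp-font extension. -->
    <script async custom-element="amp-font"
            src="https://cdn.ampproject.org/v0/amp-font-0.1.js"></script>

    <!-- In <body>: wait 0 ms for "My Custom Font"; if it is not already available,
         drop the class that opts into it so text renders with a fallback font. -->
    <amp-font layout="nodisplay"
              font-family="My Custom Font"
              timeout="0"
              on-error-remove-class="my-custom-font-loading"
              on-load-add-class="my-custom-font-loaded"></amp-font>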

AMP Lite is rolling out for bandwidth-constrained users in several countries such as Vietnam and Malaysia, and for users of low-RAM devices globally. Note that these optimizations may modify the fine details of some images, but they do not affect other parts of the page, including ads.

* * *

All told, we see a combined 45% reduction in bytes across all of the optimizations listed above. We hope to go even further in making more efficient use of users’ data to provide even faster AMP experiences.

AMP Roadmap Update for End-Q4 2016

With the arrival of 2017, it’s time to review some launches from Q4 and projects started over the past few months that will continue into the new year. We’ve updated the AMP Roadmap to provide a more detailed reflection of what’s summarized below.

Launches in Q4

This quarter we’ve seen AMP format capabilities enhanced with support for forms with <amp-form>, app-install promotions with <amp-app-banner>, and muted autoplay for amp-video and amp-youtube. In addition, we’ve launched enhancements to AMP-in-PWA support, including progressive enhancement of AMP documents by a Progressive Web App (PWA) and a reference implementation using AMP within a PWA.
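
As a quick illustration of the muted autoplay capability, an amp-video tag can now declare it directly in markup; the video URL and dimensions below are placeholders:

    <!-- In <head>: load the amp-video extension. -->
    <script async custom-element="amp-video"
            src="https://cdn.ampproject.org/v0/amp-video-0.1.js"></script>

    <!-- A video that starts playing, muted, once it becomes visible. -->
    <amp-video src="https://example.com/clip.mp4"
               width="640" height="360" layout="responsive"
               autoplay muted>
      <div fallback>This browser does not support HTML5 video.</div>
    </amp-video>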

Ads have gained support for multi-size ad requests, “flying carpet” ads, and an improved user experience for amp-sticky-ad. Meanwhile, AMP for Ads (A4A) continues to ramp up delivery of AMP format creatives by supported ad servers like DoubleClick, TripleLift and AdSense.

Analytics support also expanded based on community feedback. Triggers are now available for use with the amp-carousel and amp-form features.

Work in progress

Going into the new year, we still have work ahead on many of the projects started in Q4 or earlier, so we’ll continue to focus on those in the months ahead.

We are currently working on a mechanism to bind element behavior to user actions, which will enable further types of interactive experiences on AMP pages. We also want to provide a collection of quick-start sample code (“AMP Start”) to make it easier for developers to create great-looking AMP pages.

Additionally, we’re working on enabling a richer e-commerce experience, including support for product galleries and tabbed content navigation, as well as better e-commerce analytics.

Continuing on the analytics side, we also want to build native amp-analytics support for user interactions with video players as well as more fine-grained features for analytics like support for variable filters.

Also on the horizon are improved UX for ad loading and continued investment in A4A, increasing the number of supported ad networks and contexts, so even non-AMP pages can display AMP ads.

* * *

Thanks to the AMP development community for your work and feedback. As always, please let us know if you have any issues or feature requests.

Posted by Eric Lindley, Product Manager, AMP Project
