Today I Learned - Rocky Kev

TIL about a 12-year-old bug that finally gets squashed.


This 12-year-old WordPress bug, which Konstantin Kovshenin tweeted about, is finally getting squashed in WordPress 6.0! You can check out the bug ticket here.

Here's the tweet:

Is a static robots.txt file good for WordPress performance? Let's find out!

To display the default robots.txt, a fresh WordPress install will: 🧵

  1. Run 15 SQL queries: load all options, can_compress_scripts, WPLANG, query last 10 posts, query the recent posts widget options, query terms, taxonomies, etc., for found posts, all post metadata for found posts, recent comments widget, recent entries widget and a few others.

  2. Call file_get_contents() 61 times to register some core Gutenberg blocks, that's in addition to json_decode() each file, and about 100 calls to file_exists() on those files.

  3. Register 22 post types, 24 post statuses, 22 sidebar widgets, 12 taxonomies, 7 block pattern categories, 10 block styles, 3 image sizes.

  4. It will call gettext translations: 1917 times for regular strings, and 875 times for strings with context. I'm so lucky I'm using the default locale. Oh and exactly 0 of those strings are used in robots.txt.

  5. Check whether the front page has been set as a static page, and whether the request is_front_page() or is_home(). Also is_single(), is_feed(), is_admin(), is_category(), is_search(), the list goes on.

  6. Check whether the user is logged in, 14 times, and whether we need to display an admin bar, also heartbeat settings. It will also attempt to read the user session, and create 3 nonces. Reminder: this is an anonymous request.

  7. Escape some HTML 78 times. Reminder: robots.txt is a plain/text file, there's no HTML. It will check whether the admin needs to be forced SSL. It will also initialize smilies, and Twenty Twenty One "dark mode".

  8. Run 83 unique actions (one of them is do_robotstxt) and apply 530 unique filters (one of them is robots_txt).

All combined, that's > 42,000 function calls at 5.46 megs peak memory, about 100 ms wall time. So yes, by all means, please use a static robots.txt file.
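For what it's worth, the do_robotstxt action and robots_txt filter mentioned in that last item are the dynamic route. Here's a minimal sketch of hooking that filter to append one made-up rule; a physical robots.txt file in the web root skips WordPress (and all of the work above) entirely:

```php
<?php
/**
 * Minimal sketch: append a rule to WordPress's dynamically generated
 * robots.txt via the robots_txt filter. The /private-stuff/ path is a
 * made-up example, not anything WordPress ships with.
 */
add_filter( 'robots_txt', function ( $output, $is_public ) {
	// $is_public reflects the "Discourage search engines" setting.
	if ( $is_public ) {
		$output .= "Disallow: /private-stuff/\n";
	}
	return $output;
}, 10, 2 );

// The do_robotstxt action fires just before that filter runs, if you
// need a hook earlier in the request.
```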


What's more impressive is that someone noticed this potential optimization issue 12 years ago, and the technical debt kept piling on as new features were added, encapsulated for reusability, and then called over and over.

"Check whether the user is logged in, 14 times"

I know I'm guilty of doing this in a bunch of functions. This really hammers home that we don't know what to optimize until the dust settles.
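For my own reference, here's a rough sketch of the pattern I try to reach for now: cache an expensive check in a static variable so it only runs once per request. expensive_permission_check() is a made-up stand-in, not a real WordPress function.

```php
<?php
/**
 * Sketch of memoizing a repeated check. expensive_permission_check()
 * is hypothetical; the point is paying the cost once per request
 * instead of every time the helper is called.
 */
function my_plugin_user_can_foo() {
	static $cached = null;

	if ( null === $cached ) {
		$cached = expensive_permission_check( get_current_user_id() );
	}

	return $cached;
}
```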

Overall, this update could potentially shave around 100ms off every robots.txt request! Which is really impressive from a big-picture sense: it only takes about 100ms to run all that code, but by skipping it, we save a MASSIVE 100ms per request.


Related TILs


TIL Static Blocks vs Dynamic Blocks

A static block is a piece of content whose markup is known when the page is saved. The block saves its content and markup directly in the post content. A dynamic block is a piece of content whose markup and exact content are not known when the page is saved.
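As a rough sketch (the block name and markup here are made up), a dynamic block registers a render_callback that builds its output on every request:

```php
<?php
/**
 * Sketch of a dynamic block: no markup is saved in post content;
 * it's generated at request time by the render_callback.
 */
add_action( 'init', function () {
	register_block_type( 'rockykev/latest-post', array(
		'render_callback' => function () {
			$posts = get_posts( array( 'numberposts' => 1 ) );

			if ( empty( $posts ) ) {
				return '';
			}

			return sprintf(
				'<p>Latest: <a href="%s">%s</a></p>',
				esc_url( get_permalink( $posts[0] ) ),
				esc_html( get_the_title( $posts[0] ) )
			);
		},
	) );
} );
```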

TIL how to convert a shortcode to a WP block

Traditionally, shortcodes were a way that plugin developers could give users the ability to add specific plugin functionality anywhere on their site. But shortcodes are not very user friendly, nor was hunting down the relevant data needed to render the shortcode correctly. Converting existing shortcodes to blocks provides a much better user experience in all aspects.
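One migration path (sketched here with a made-up my_gallery shortcode) is to keep the existing shortcode handler and reuse it as the block's render_callback, so both render the same output:

```php
<?php
/**
 * Sketch: reuse an existing shortcode handler as a dynamic block's
 * render_callback. my_gallery_shortcode() stands in for whatever
 * handler the plugin already ships.
 */
function my_gallery_shortcode( $atts ) {
	$atts = shortcode_atts( array( 'count' => 4 ), $atts, 'my_gallery' );
	return '<div class="my-gallery" data-count="' . (int) $atts['count'] . '"></div>';
}
add_shortcode( 'my_gallery', 'my_gallery_shortcode' );

add_action( 'init', function () {
	register_block_type( 'rockykev/my-gallery', array(
		'attributes'      => array(
			'count' => array( 'type' => 'number', 'default' => 4 ),
		),
		// Delegate to the shortcode handler so both stay in sync.
		'render_callback' => 'my_gallery_shortcode',
	) );
} );
```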

TIL how WordPress does serverside

This isn't fully accurate, but for the purpose of describing server-side rendering, it's a good start!