By default, WordPress dynamically generates up to 50,000 URL entries per individual wp-sitemap.xml file (on large sites). If you're on shared web hosting or a budget dedicated server/VPS, generating a 50,000-URL wp-sitemap.xml file can take anywhere from ~2 seconds with a fast hosting provider to 10 seconds or more with a budget one.
The "Couldn't fetch" in Google Search Console
Anything above ~5 seconds will be problematic for search engines such as Google to read. One telltale sign of this is the "Couldn't fetch" status in Google Search Console.
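If you want to put a rough number on that before and after the change, here's a small sketch of my own (not from the original post; the sitemap URL is a placeholder):

```php
<?php
// Hypothetical timing helper: time any fetch so you can see whether a
// sitemap page is crossing the ~5 second danger zone.
function timed_fetch( $url, $fetcher = null ) {
	$fetcher = is_callable( $fetcher ) ? $fetcher : 'file_get_contents';
	$start   = microtime( true );
	$body    = $fetcher( $url );
	return array(
		'seconds' => microtime( true ) - $start,
		'body'    => $body,
	);
}

// Usage sketch (assumed URL):
// $r = timed_fetch( 'https://example.com/wp-sitemap-posts-post-1.xml' );
// printf( "fetched in %.2f s\n", $r['seconds'] );
```

Bear in mind this only measures one uncached request; run it a few times to see cached vs. uncached behaviour.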
You can lower the number of entries generated by adding a simple PHP function to your theme's functions.php. I recommend adding the function to the very bottom of the file and leaving the comments intact for readability (especially in a cluttered functions.php).
Add this function to your functions.php
PHP:
// Limit the number of URLs generated per XML sitemap page.
// The wp_sitemaps_max_urls filter was introduced in WordPress 5.5.
add_filter( 'wp_sitemaps_max_urls', 'xml_sitemap_max_urls', 10, 2 );

function xml_sitemap_max_urls( $num, $object_type ) {
	// Set the count per sitemap file below (WordPress default and maximum: 50,000).
	return 4000;
}
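The filter's second argument goes unused above, but core passes the sitemap object type through it, so you can cap different providers differently. This variant is my own sketch (the function name and the 2,000 figure are arbitrary, not from the original post):

```php
<?php
// Hypothetical variant: vary the cap per sitemap provider.
// Core passes 'post', 'term', or 'user' as the second argument.
if ( function_exists( 'add_filter' ) ) { // guard so the snippet also parses outside WordPress
	add_filter( 'wp_sitemaps_max_urls', 'xml_sitemap_max_urls_by_type', 10, 2 );
}

function xml_sitemap_max_urls_by_type( $max_urls, $object_type ) {
	// Post queries are usually the heaviest, so cap them hardest.
	if ( 'post' === $object_type ) {
		return 2000;
	}
	return $max_urls; // keep the incoming default (50,000) for terms and users
}
```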
Once added, locate the WordPress XML sitemap, usually found at example.com/wp-sitemap.xml (the sitemap index). Visit one of the individual sitemap files. If all went to plan, it should resemble the image below:
Notes:
I have implemented this on the core version of WordPress, meaning no fancy bloatware SEO plugins installed such as Yoast, Rank Math or All in One SEO Pack. You should refer to their documentation should you wish to try the above. The function may well work alongside Yoast and the rest; I just haven't tested them.
Inspiration taken from this article by WP Kama: https://wp-kama.com/handbook/sitemap
It's also worth noting that WordPress's core out-of-the-box XML sitemap functionality doesn't include the lastmod date or the priority tag.
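Priority is arguably no loss (Google has said it ignores it), but lastmod can be added through a core filter if you want it. A hedged sketch using the wp_sitemaps_posts_entry filter (WordPress 5.5+); the function name and date formatting are my own, and I haven't tested this alongside SEO plugins:

```php
<?php
// Hedged sketch: add a lastmod value to each post entry via the core
// wp_sitemaps_posts_entry filter (available since WordPress 5.5).
if ( function_exists( 'add_filter' ) ) { // guard so the snippet also parses outside WordPress
	add_filter( 'wp_sitemaps_posts_entry', 'xml_sitemap_add_lastmod', 10, 2 );
}

function xml_sitemap_add_lastmod( $sitemap_entry, $post ) {
	// post_modified_gmt is already UTC; format it as a W3C datetime,
	// which is what the sitemap protocol expects for lastmod.
	$sitemap_entry['lastmod'] = gmdate( DATE_W3C, strtotime( $post->post_modified_gmt . ' +0000' ) );
	return $sitemap_entry;
}
```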
From my standpoint, the approach above has been purely for testing and problem-solving purposes. I have been generating millions of URLs for sandbox/testing using the WordPress REST API. You can read about that here: https://chrisleverseo.com/t/python-...-from-a-csv-file-using-wordpress-rest-api.93/
Tip:
I've found the new dynamically generated XML sitemaps load even quicker if you first open them in your browser, which gives the CDN/caching plugin a chance to cache them, and then submit/resubmit them in Google Search Console.
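If you'd rather script that warm-up than click through every sitemap in a browser, one option (my own sketch, not from the original post; the domain is a placeholder) is to parse the index and request each child sitemap once:

```php
<?php
// Hypothetical helper: extract every <loc> URL from a sitemap index so each
// child sitemap can be requested once, letting the CDN/caching plugin warm up.
function sitemap_index_urls( $xml ) {
	$urls = array();
	$doc  = new DOMDocument();
	if ( ! @$doc->loadXML( $xml ) ) {
		return $urls; // not valid XML
	}
	foreach ( $doc->getElementsByTagName( 'loc' ) as $loc ) {
		$urls[] = trim( $loc->textContent );
	}
	return $urls;
}

// Usage sketch (assumed URL):
// foreach ( sitemap_index_urls( file_get_contents( 'https://example.com/wp-sitemap.xml' ) ) as $url ) {
//     file_get_contents( $url ); // primes the cache before (re)submitting to GSC
// }
```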