JcMagpie, posted July 24, 2018

Why? By placing a formatted XML sitemap file on your web server, you enable search-engine crawlers (such as Google) to find out which pages are present and which have recently changed, and to crawl your site accordingly. This is done using a script written by iProDev (Hemn Chawroka); all credit to him for the original code. This is purely for the search engines.

You can find a number of add-ons that can be installed to do this for you. This is a simple manual method. Simply add two files to your site root:

/sitemap-generator.php
/simple_html_dom.php

Then call sitemap-generator.php from the browser and it's done. Go to https://www.yoursite.com/sitemap-generator.php and hit Enter. That's it. The sitemap.xml file will be placed at https://www.yoursite.com/sitemap.xml.

You can then let the robots.txt file take care of telling crawlers where to find the sitemap, or manually submit it to Google if you have an account. For robots.txt, simply add this to the end of the file:

Sitemap: https://www.yoursite.com/sitemap.xml

I only update my sites 3 to 4 times a year, so this simple approach works fine for me, plus it's one less add-on to worry about. If you need to, you can automate this by using a simple cron job.

JcM

Attachment: sitemap xml.zip
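The steps above can be sketched as a small shell script. This is a minimal sketch, not part of the original post: the domain https://www.example.com is a placeholder, and the curl call and cron entry are shown as comments because they only make sense against a live server.

```shell
#!/bin/sh
# Placeholder domain; replace with your own site.
SITE="https://www.example.com"

# Step 1: trigger the generator (normally done from a browser,
# or from cron for automation). Commented out here because it
# needs a live server hosting sitemap-generator.php:
#   curl -s "${SITE}/sitemap-generator.php" > /dev/null

# Step 2: tell crawlers where the sitemap lives by appending
# a Sitemap directive to robots.txt (creates the file if absent):
echo "Sitemap: ${SITE}/sitemap.xml" >> robots.txt

# Step 3: example cron entry to regenerate weekly, Sundays at 03:00
# (add with `crontab -e`):
#   0 3 * * 0 curl -s https://www.example.com/sitemap-generator.php > /dev/null
```

For a site updated only a few times a year, running the URL by hand is enough; the cron entry is only worth adding if updates become frequent.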