REMINDER
Blog posts are not being indexed; we need the blog posts attached to our custom domains, not AppDrag.
-
Hey there, guys,
I have tried crawling my site manually for a week with two pro site-search tools, and the blog posts are not included. I have updated the posts, published the site, and manually updated the sitemap many times, but still no luck. The two normal pages are picked up instantly, but not the blog posts, which is peculiar because the blog list is on the index.html. It just dawned on me when reviewing my blog posts that they still have the AppDrag URL. Are you guys working on giving the blog posts our own custom domains? Can we manually update our robots.txt and sitemap?
Thanks in advance,
Linda -
@Linda-MacDonald
There are multiple sitemaps in the robots.txt file; you might want to add https://yoursite.com/sitemap-blog.xml
to the search engines (if you haven't already). -
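For reference, the sitemaps.org protocol lets a robots.txt declare any number of sitemaps, each on its own `Sitemap:` line. A minimal sketch (the yoursite.com URLs are placeholders, not AppDrag's actual output):

```
User-agent: *
Allow: /

Sitemap: https://yoursite.com/sitemap.xml
Sitemap: https://yoursite.com/sitemap-blog.xml
```

Crawlers that honor the protocol will discover every sitemap listed here, regardless of filename.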
@ThomasD Thanks a mil for getting back to me, but I also think the team needs to address this: if you take a look at your blog permalinks, they are still on AppDrag and not on your custom domain. So yes, we need to add the XML to the search engines, but we also need our blog content to be served under our custom domain URL.
-
@Linda-MacDonald If you have added a real domain, then it should show in robots.txt
-
@ThomasD Thanks a mil for your advice. I'm going to fix my XML file as soon as the team takes a look at my code editor, which at the moment has no save button.
-
Linda, you just have to set your domain as the main domain in your project dashboard. Then publish your website, and your sitemaps will point to your domain and not AppDrag.
-
@Wassim There are some issues here that you guys might want to address.
A: Yup, everything points to my custom domain except the permalink section in the blog section; those still say AppDrag. Also, I am guessing the blogs are served through some sort of JavaScript? Search engines notoriously handle JavaScript badly.
B: There is a sitemap.blog.xml for the blog posts; those won't be registered by e.g. Google. Most search engines only accept sitemap.xml and won't crawl any other XML files. I had a word with the team at Search 360 yesterday, and they cannot see my blogs in the XML, the reason being the filename is sitemap.blog.xml and not sitemap.xml. The same goes for Google. And I seemingly cannot edit my XML files, but this might be because I cannot see anything in my header. I cannot see functionality for creating new projects either at the moment; please see my screenshot:
https://www.awesomescreenshot.com/image/5989919?key=ddcb022e7ef546b991a25fcadaa66588
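A quick way to check the permalink issue yourself is to look at the `<loc>` entries in the blog sitemap and see which domain they point to. A minimal sketch in Python; the XML below is a made-up example (including the hypothetical `yoursite.appdrag.site` URL), and in practice you would fetch the live sitemap instead:

```python
# Sketch: list which URLs in a blog sitemap still point at AppDrag
# rather than the custom domain. The sample XML is hypothetical.
import xml.etree.ElementTree as ET

SITEMAP_XML = """\
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yoursite.appdrag.site/blog/post-1.html</loc></url>
  <url><loc>https://yoursite.com/blog/post-2.html</loc></url>
</urlset>
"""

# Sitemap files use the sitemaps.org namespace, so queries must be qualified.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP_XML)
locs = [el.text for el in root.findall("sm:url/sm:loc", NS)]

# Any URL still mentioning appdrag is a permalink that hasn't moved over.
offenders = [u for u in locs if "appdrag" in u]
print(offenders)
```

If the list is non-empty, the blog permalinks are still being generated on the AppDrag domain, which matches what Linda is describing.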
-
A: The blogs are rendered in HTML at the "real" URLs of the blog pages; see the links in the blog sitemap for that.
B: The sitemaps are correctly linked in the robots.txt files on the domains, so the search engines will pick those up. Don't fixate on a single sitemap file. Also, as I wrote in the other thread, you should always add the sitemaps to the search-engine consoles/tools, even if they would eventually be picked up automatically.
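On the filename question: crawlers don't look only for a file literally named sitemap.xml; they read the `Sitemap:` lines in robots.txt, whatever the declared files are called. A minimal sketch of that discovery step, using a hypothetical robots.txt (the yoursite.com URLs are placeholders):

```python
# Sketch: how a crawler discovers sitemaps from robots.txt.
# The robots.txt content below is a hypothetical example.

ROBOTS_TXT = """\
User-agent: *
Allow: /

Sitemap: https://yoursite.com/sitemap.xml
Sitemap: https://yoursite.com/sitemap-blog.xml
"""

def extract_sitemaps(robots_txt: str) -> list[str]:
    """Return every URL declared on a 'Sitemap:' line, regardless of filename."""
    sitemaps = []
    for line in robots_txt.splitlines():
        # The Sitemap field name is case-insensitive per the sitemaps.org protocol.
        if line.lower().startswith("sitemap:"):
            # maxsplit=1 keeps the colons inside the URL intact.
            sitemaps.append(line.split(":", 1)[1].strip())
    return sitemaps

print(extract_sitemaps(ROBOTS_TXT))
```

Both declared sitemaps are found, including the blog one, even though neither filename has to be sitemap.xml. Submitting them explicitly in the search-engine consoles just speeds this discovery up.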