Dynamic pages not being indexed
this is an index of sub-pages, with dynamic URLs:
An array of real "dofollow" links is rendered to the search engine, but these sub-pages are not being indexed.
Any idea why?
Thanks in advance!
Make your website URLs search engine friendly; a URL should be readable. Also, I think you need to check your sitemap and set the update frequency to daily rather than yearly. Try submitting a sitemap.xml file.
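For reference, the update frequency is set per URL in the sitemap with the <changefreq> element. A minimal illustrative entry (the URL below is a placeholder, not one of the actual pages discussed in this thread):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Illustrative entry only; the URL is a placeholder -->
  <url>
    <loc>http://www.example.com/some-page/</loc>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```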
The URLs are all in this format:
Is that not search engine friendly?
I've now also installed a WP plugin, "XML Sitemaps", that was missing, but I don't know whether it's smart enough to recognise the above links...
EDIT: I could see in the sitemap.xml that the update frequency was "yearly".
Unfortunately, the XML Sitemaps tool doesn't allow a change of frequency.
I read, though, that it pings the file frequently, so I've deleted the sitemap.xml, hoping that it will be updated...
I am eager to see whether it's smart enough to recognise dynamic pages...
Thanks very much for your reply!
Last edited by arvgta; 09-23-2014 at 10:49 AM.
I don't think it is search engine friendly; try to have a link URL without ? or =, so that it can be easily read by search engines.
There are many tools available to generate a sitemap file, and some let you edit the file. I feel it would be good if you set the frequency to daily.
Thanks again for your reply.
A new sitemap.xml has been re-generated, unfortunately with those dynamic URLs missing. So the WP XML Sitemaps tool is not smart enough to generate the dynamic URLs. This would be the first step...
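As a quick way to verify which URLs actually made it into a generated sitemap, the file can be parsed with Python's standard library. A sketch only; the sitemap content below is an illustrative stand-in, not the real file:

```python
import xml.etree.ElementTree as ET

# Namespace used by the sitemap protocol
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the list of <loc> values found in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

# Stand-in sitemap content for illustration; in practice you would
# fetch the real sitemap.xml (e.g. with urllib.request.urlopen).
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc><changefreq>daily</changefreq></url>
  <url><loc>http://www.example.com/?page_id=42</loc><changefreq>daily</changefreq></url>
</urlset>"""

for url in sitemap_urls(sample):
    print(url)
```

If the dynamic URLs don't show up in this list, the generator simply never saw them.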
What's the correct WP way of doing this?
thanks for your contribution!
There's no option in the WP XML Sitemaps for setting the frequency of updates.
However, when the sitemap.xml is deleted, it gets re-generated quite quickly, as happened last night.
The real problem is that the generator does not recognise dynamic URLs, on a special WP page template.
I am thinking of completely chucking the WP XML Sitemap generator and deleting the sitemap.xml for good. Then Google would probably index the sub-pages...
Or alternatively, it would be interesting to employ a WP plugin that is a bit smarter? I've googled around but didn't find anything like it...
Last edited by arvgta; 09-24-2014 at 04:41 AM.
Isn't it possible to change your dynamic URLs into static ones in some way?
As for a sitemap with dynamic URLs: I tried creating a sitemap with an online sitemap generator tool, and it could handle dynamic URLs too.
Please check the link: http://www.xml-sitemaps.com/details-....14667139.html; there you can download your sitemap.xml file, with your website's dynamic URLs included.
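On the first point: on an Apache host, query-string URLs can often be mapped to static-looking paths with mod_rewrite. A hedged sketch (the `id` parameter and path scheme are hypothetical placeholders, not taken from the site in question, and mod_rewrite must be enabled):

```apache
# .htaccess sketch: serve /page/123/ from index.php?id=123
# "id" is a hypothetical parameter name for illustration only
RewriteEngine On
RewriteRule ^page/([0-9]+)/?$ index.php?id=$1 [L,QSA]
```

In WordPress itself, the usual route is the Settings → Permalinks screen, which sets up equivalent rewrite rules for you.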
Excellent! Thanks very much!
The dynamic URLs are recognised by http://www.xml-sitemaps.com/ !
I've chucked out the XML Sitemap generator.
I've also created a Google Webmaster Tools entry for http://www.oeko-fakt.de/ and submitted the uploaded sitemap.xml.
Google seems to be happy; the status is "pending", but I trust these pages will be spidered shortly.
I'll report back, as soon as they're spidered...
Thanks a million for your effort!
Ok, the sitemap might be processed within 24 hrs; keep checking in Google Webmaster Tools to know the status of your sitemap submission. And I hope you are done with all the other on-page strategies to get more visibility on search engines. All the best.
You're a star! All the best, too!
Unfortunately, over a month later, the problem persists.
Here's a link to the sitemap: http://www.oeko-fakt.de/sitemap.xml
(I've set the update frequency to "daily" everywhere, and re-submitted to Google)
So one remaining culprit is the URL structure; the URLs are like this:
Are the above URLs valid?
What else could be the problem?
Thanks in advance
EDIT: Found this: http://www.searchenginejournal.com/f...url-structure/
...which indicates that the URLs are not the problem
(i.e. they should be crawled, even if not ideal)
Any other ideas?
Uhm... from what I'm seeing, every one of your anchors takes me to the exact same page, just with a different URL/request. I'm not seeing any unique content anywhere on any of the pages. What exactly are you expecting search to do other than pimp-slap you for duplicate content?
Wait, do you have some form of scripttardery screwing with the anchors or something? I only get a different page if I click on the anchor (changing the URL) and then do a refresh... I bet that's what's happening: if my following your anchors isn't taking me to a different page, I bet the search engine is having the exact same issues...
Confirmed, disabling/blocking JS actually takes me to new pages. I'd say that Google's recent (past three to four years) changes of trying to obey when JS screws with navigation and/or generates content is what's making the page fail.
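One way to sanity-check this without a browser is to look at what the raw HTML anchors point to before any JavaScript runs. A minimal sketch using Python's standard-library HTML parser; the markup below is an illustrative stand-in, not the actual page:

```python
from html.parser import HTMLParser

class AnchorCollector(HTMLParser):
    """Collect href values from <a> tags in raw HTML."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.hrefs.append(value)

# Stand-in markup: one crawlable link, one JS-only link
sample_html = '<a href="/sub-page/">ok</a> <a href="#" onclick="go()">js-only</a>'

parser = AnchorCollector()
parser.feed(sample_html)

# Links a crawler can follow without executing JavaScript
crawlable = [h for h in parser.hrefs
             if h and not h.startswith(("#", "javascript:"))]
print(crawlable)  # only the real href survives
```

If the crawlable list doesn't contain distinct per-page URLs, a JS-free crawler sees the same thing the poster above describes.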
Really, this is just another train wreck of how not to build a website: absolute URLs for nothing, endless pointless DIVs and classes for nothing, and worst of all, endless pointless code-bloat JS for nothing.
If I were to take a wild guess, I'd probably say it's that "ajaxify" nonsense mucking with you... which is the typical "I can haz intarnets" scripting garbage that wouldn't be needed on such a simple site if the markup, css, and other scripting wasn't such a bloated mess.
I'm usually grateful for almost any answer, but yours is so exaggeratedly negative that I can't take it seriously as a whole...