kguske
Site Admin
Joined: Jun 04, 2004
Posts: 6433
Posted: Sun Jan 13, 2008 3:28 pm
I am doing some research to determine what features to emphasize in the update of nukeSEO. The initial info is very interesting and impacts several areas that can't be influenced by a nukeSEO-like tool.
The general consensus among SEO experts (not sure how that's defined, but I don't disagree with the list) is that one factor, HTML validation, has little or no impact on search engines.
Comments from the experts:
Adam Lasnik (Google) mentioned that rewarding validation & accessibility of documents would be a 'slippery slope.'
Mike McDonald: "Walking under ladders, breaking mirrors and stepping on cracks probably has more influence on your SERPs than validation."
One expert did comment that if invalid code blocks a spider from indexing your site, it could hurt your rankings.
Of course, I'm not suggesting that we not try to make things compliant. I just wanted to point out that it might not impact SERP.
Thoughts?
_________________ I search, therefore I exist...
nukeSEO - nukeFEED - nukePIE - nukeSPAM - nukeWYSIWYG
warren-the-ape
Worker
Joined: Nov 19, 2007
Posts: 196
Location: Netherlands
Posted: Sun Jan 13, 2008 4:25 pm
(X)HTML/CSS/etc. validation is just a tool and should never be a goal in itself.
I can create 100% validated pages which are absolutely terrible in terms of accessibility and don't use any logical markup at all (i.e. a <div> soup).
So I definitely agree with:
Quote: "HTML validation has little or no impact on search engines."
It changes (a bit) when we actually start using logical markup and the most appropriate tags and elements (building semantic webpages). But even then... most SEO experts haven't noticed any major differences in page rankings.
If you read the various SEO blogs and papers, they always say that 'content is king', no matter what fancy metadata you use or how semantic your page is. If your content isn't original (and optimized), other factors become almost irrelevant.
Page titles have a big impact as well, which is why I was very happy to find Montego's Dynamic Titles (and what brought me here).
I personally would really like to see a 'dynamic description' variant which uses the content of a post/reply or article to generate it. Montego already told me that something like that will be produced in the near future and would be part of nukeSEO, so I can't wait.
kguske
Posted: Sun Jan 13, 2008 5:19 pm
Dynamic META tags (titles, descriptions, keywords) are definitely on the hit list for the next version of nukeSEO - as well as the ability to override them for a specific content item.
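A rough sketch of what such a dynamic description generator might do - in Python for readability (nukeSEO itself would be PHP), with the function name my own invention: strip markup, collapse whitespace, and truncate at a word boundary around the length search engines typically display.

```python
import re

def meta_description(html, limit=155):
    """Derive a META description from an article body (sketch only)."""
    text = re.sub(r'<[^>]+>', ' ', html)      # drop HTML tags
    text = re.sub(r'\s+', ' ', text).strip()  # collapse whitespace
    if len(text) <= limit:
        return text
    cut = text.rfind(' ', 0, limit)           # truncate at a word boundary
    return text[:cut if cut > 0 else limit] + '...'
```

A real implementation would also need to handle entities and per-module content sources, but the core idea is this simple.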
Susann
Moderator
Joined: Dec 19, 2004
Posts: 3191
Location: Germany (Moderator, German NukeSentinel Support)
Posted: Sun Jan 13, 2008 6:41 pm
Don't forget the multilingual function, because Nuke is more or less multilingual.
I mean there are many sites which have several languages installed and publish content in different languages, but there isn't any option to use different metas for each language.
Or should we all have the same language in the near future?
Back to your subject: not a single page on my first website was valid, apart from the index. But it was an authority site for a long time.
My site is now valid and my PR dropped twice, but I still get 10 - 25 new users daily. So the question is: what really counts?
PR is for your own ego, and SEO is like a fever. Search engine optimization is a daily and often "dirty" fight.
I don't have the fever anymore. Maybe it comes back one day.
I just like my W3C-conformant sites, and that is something that counts for me.
Guardian2003
Site Admin
Joined: Aug 28, 2003
Posts: 6799
Location: Ha Noi, Viet Nam
Posted: Sun Jan 13, 2008 7:14 pm
Yes, multilingual meta tags are going to be tricky, although there is a very basic way of doing it: retrieve the browser language and then display the appropriate data.
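A minimal sketch of that browser-language idea, in Python for readability (a real PHP-Nuke implementation would read the Accept-Language request header from the server environment). Note this simplified version takes the header's listed order at face value and ignores q-value weighting; all names here are made up.

```python
def pick_language(accept_language, available, default='english'):
    """Pick the best-supported site language for a visitor's
    Accept-Language header (rough sketch, q-values ignored)."""
    for part in accept_language.split(','):
        code = part.split(';')[0].strip().lower()  # e.g. 'de-DE' -> 'de-de'
        primary = code.split('-')[0]               # 'de-de' -> 'de'
        if primary in available:
            return available[primary]
    return default
```

The selected language could then drive which set of META descriptions and keywords gets emitted.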
Getting back to the question in hand:
I would *guess* that optimised (compliant) code would be more favourable than code that isn't.
A search engine doesn't see a web page in quite the same way we do, so I would *guess* that it might find one page *better* than another, in the same way that not all pages are cross-browser compatible and so might present the data differently or more quickly.
I'm not saying it definitely has a bearing on your SERP, but who knows - it might, though the effect would be minuscule.
For me, the question is moot. If you can do it 'right', why do it sloppily?
I created two plain HTML sites in the last couple of weeks. Both use the same keywords and meta data, both have the same keywords in their web address, both were promoted in the same places, both have site maps, etc. One is XHTML 1.0 Strict with valid CSS1 and is the second link on page two of Google; the other I threw together and am still 'tweaking', and that one's on page 15, lol - so who knows.
kguske
Posted: Mon Jan 14, 2008 11:56 am
Multi-language support is on the drawing board, but not likely for the next release.
Enjoyed the comments regarding PR ego and SEO fever!
Susann
Posted: Mon Jan 14, 2008 1:07 pm
I always tell you the truth.
montego
Site Admin
Joined: Aug 29, 2004
Posts: 9457
Location: Arizona
Posted: Mon Jan 14, 2008 2:21 pm
I agree. HTML/XHTML compliance has very little to do with SERP and, IMO, was never really the reason to go down that path to begin with. However, I also agree that if it breaks the ability of a spider to crawl, for whatever reason, that is not a good thing.
IMO, compliance is important in terms of user agent rendering consistency and also the ability of the user agent to render the content and formatting more quickly (e.g., it knows image sizing up front and can "reserve" the container on the screen while the image downloads in the background, rather than having to figure out how to render when tags are missing - just a few examples).
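A small illustration of that image-sizing point: explicit dimensions let the user agent reserve the box before the file arrives (generic HTML, not nuke-specific).

```html
<!-- width/height are known up front, so the browser can reserve
     a 200x150 box and keep laying out the rest of the page
     while the image downloads in the background -->
<img src="logo.png" width="200" height="150" alt="Site logo" />
```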
It is also important, IMO, for accessibility, but I think this could be covered under the more general topic of "user agent rendering consistency", as something like a screen reader still has to "render" the content, just not on a visual screen per se.
Just my 2 cents' worth on the "original topic".
_________________ Where Do YOU Stand?
HTML Newsletter::ShortLinks::Mailer::Downloads and more...
kguske
Posted: Mon Jan 14, 2008 5:35 pm
Curious to know what other thoughts there are (aside from the previously mentioned dynamic META tags) on features and/or factors that affect SERP...
montego
Posted: Tue Jan 15, 2008 6:24 am
Unfortunately, all of my reading points to nothing of major significance (other than what has already been noted), but lots of little things which "may" add up to something significant.
For example, moving real content higher up within the HTML. This would require that themes be re-written, as well as some of the RN/nuke code, so that the content sits structurally as close to the <body> tag as possible while visually retaining the left-hand and right-hand block layouts.
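One common way to get that content-first source order while keeping the familiar left/right block columns is to position the columns with CSS. This is a generic sketch, not how RN/nuke themes are actually structured; the ids and pixel sizes are made up.

```html
<!-- content comes first in the source, so spiders see it early -->
<body style="position: relative;">
  <div id="content" style="margin: 0 180px;">
    Real content, as close to the body tag as possible...
  </div>
  <!-- blocks come later in the source but are drawn at the edges -->
  <div id="leftblocks"
       style="position: absolute; top: 0; left: 0; width: 170px;">
    Left-hand blocks
  </div>
  <div id="rightblocks"
       style="position: absolute; top: 0; right: 0; width: 170px;">
    Right-hand blocks
  </div>
</body>
```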
It may still be valuable to shorten links to .html links. In fact, it would be even better if the link were more descriptive and included 1 or 2 keywords. BUT, as always, one has to tread lightly, as everything needs to have the "appearance" of being useful to the end user. I.e., one does not want to "trip" any penalizers (like my newly coined word? lol).
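For reference, the ShortLinks/GoogleTap family of mods does this kind of thing with server rewrite rules. A hypothetical .htaccess sketch (the real rule patterns differ per mod):

```apache
RewriteEngine On
# map a keyword-bearing, static-looking URL onto the real dynamic one
RewriteRule ^article-([0-9]+)-[a-z0-9-]+\.html$ modules.php?name=News&file=article&sid=$1 [L,QSA]
```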
I think, though, there is also another way of looking at this, and that is from the perspective of "what not to do". However, that is also a long list of potential "penalizers".
It seems to me that you are addressing the key items.
Guardian2003
Posted: Tue Jan 15, 2008 9:52 am
As far as nukeSEO (the module) goes, I'm with Montego.
There isn't much more you can do with META data from an SEO standpoint once you have the 'dynamic' and 'language aware' functionality.
What's needed after that is a way to integrate nukeSEO with Dynamic Titles and ShortLinks to produce a short, keyword-relevant URL - the obstacle being that once a URL is 'crafted' it can no longer be dynamic, otherwise you'll start getting 403s.
HOWEVER - I have seen some clever use of this! The 'script' pulls all the relevant data (much like Montego's HTML Newsletter), then stores the whole shebang by writing it out to a static HTML file with the requisite file name.
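The approach described here - regenerate on demand, then serve the stored static copy while it is fresh - might look roughly like this. A generic Python sketch, not the actual script referred to above; every name in it is made up.

```python
import os
import time

def cached_page(path, render, max_age=3600):
    """Serve a static HTML copy if it is fresh enough,
    otherwise regenerate and persist it (write-through cache sketch)."""
    if os.path.exists(path) and time.time() - os.path.getmtime(path) < max_age:
        with open(path, encoding='utf-8') as f:
            return f.read()                   # serve the stored copy
    html = render()                           # build the dynamic page
    with open(path, 'w', encoding='utf-8') as f:
        f.write(html)                         # store it as plain HTML
    return html
```

As noted below, this trades freshness (and risks duplicate content) for speed, so it suits content that rarely changes.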
Personally I wouldn't go that far - it must be a nightmare given the possibility of duplicate content. That said, it might be handy for content that doesn't change, like the Content module or Tutorials, but it kind of defeats the whole purpose of having a dynamic site, doesn't it?
From a *generic* SEO perspective, nuke fights us all the way. There is no way to make use of image alt tags without editing the files directly. A news article's 'Description' is displayed in an H1 tag, so unless you have your keyword at the beginning of every news item description you are 'watering down' your keyword density... and the list goes on.
kguske
Posted: Tue Jan 15, 2008 8:40 pm
Thanks for the encouragement...
montego
Posted: Wed Jan 16, 2008 4:21 pm
Guardian2003 wrote: "News article 'Description' is displayed in a H1 tag so unless you have your keyword at the beginning of every news item description you are 'watering down' your keyword density and the list goes on..."
Guardian, I am not seeing this within the fisubice theme, at least on the home page view and the article view. Maybe that particular issue is theme-related and can be corrected. Just a thought.
HOWEVER, you are still ultimately correct in everything you say. Nuke essentially makes it very difficult to be SEO-friendly, which is why a little re-factoring is in order.
BTW, one of the reasons I am hesitant to jump in and change ShortLinks is the install base of it, GTNG and GoogleTap, which all use a similar methodology. I feel like I need to figure out a slick way to properly 301-redirect the old URLs to any of the new ones... some day. I think it is doable, but I'm always looking for better and better ways...
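The usual tool for that 301 problem is a permanent-redirect rewrite rule. A hypothetical .htaccess sketch - the actual old and new URL patterns would depend on which of ShortLinks/GTNG/GoogleTap generated them:

```apache
RewriteEngine On
# permanently redirect an old GoogleTap-style URL to a newer form,
# so search engines transfer any ranking to the new address
RewriteRule ^article([0-9]+)\.html$ /news-$1.html [R=301,L]
```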
Guardian2003
Posted: Thu Jan 17, 2008 2:10 am
Yes, a degree of refactoring would be needed. If a theme doesn't show any H1 tags at all, then you are losing the opportunity to 'emphasise' keywords, so it's a lose-lose situation again.
I'm no expert, though; I go purely on what gives me the best results, either by experimentation or from what I learned building my old 'hosting' websites, which have extremely competitive keywords. I don't have the time or inclination to take the 'experts' approach to SEO of adding a grain of sand here and taking away a grain of sand there - if I can just chuck a bucket of sand at it and get a sand-pit back in return, that's good enough for me. In fact, unless you are aiming for a highly competitive market, you can re-use the same bucket over and over again to give you the 'essentials'.
I feel an article coming on, lol...