WordPress plugin: Google XML Sitemaps 4.0 significant changes

One of the most popular WordPress plugins – Google XML Sitemaps – has recently been upgraded to version 4.0, with some significant changes.  Here is a quote from the changelog:

New in Version 4.0 (2014-03-30):

  • No static files anymore, sitemap is created on the fly!
  • Sitemap is split-up into sub-sitemaps by month, allowing up to 50.000 posts per month!
  • Support for custom post types and custom taxonomies!
  • 100% Multisite compatible, including by-blog and network activation.
  • Reduced server resource usage due to less content per request.
  • New API allows other plugins to add their own, separate sitemaps.
  • Note: PHP 5.1 and WordPress 3.3 is required! The plugin will not work with lower versions!
  • Note: This version will try to rename your old sitemap files to *-old.xml. If that doesn’t work, please delete them manually since no static files are needed anymore!
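
Out of curiosity about the new on-the-fly format, here is a minimal sketch – plain Python, standard library only – that fetches a sitemap index and lists the sub-sitemaps (with version 4.0, roughly one per month of posts) it references.  The https://example.com/sitemap.xml URL is a placeholder for your own site, not a real address.

```python
# Minimal sketch: fetch a sitemap index and list its sub-sitemaps.
# https://example.com/sitemap.xml is a placeholder URL, not a real site.
from urllib.request import urlopen
import xml.etree.ElementTree as ET

SITEMAP_INDEX = "https://example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urlopen(SITEMAP_INDEX) as response:
    tree = ET.parse(response)

# A sitemap index contains <sitemap><loc>...</loc></sitemap> entries,
# one per sub-sitemap.
for sitemap in tree.findall("sm:sitemap", NS):
    loc = sitemap.find("sm:loc", NS)
    if loc is not None:
        print(loc.text)
```

Running something like this against your own site is also a quick way to confirm that the dynamic sitemap, and not one of the old static files, is the one actually being served.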

11 ways you can tweet better

  1. Use a hashtag to drive the conversation.
  2. Organize a live event using Twitter.
  3. Tweet the past as if it were the present.
  4. Have a celebrity take over your account.
  5. Join the conversation where it takes place.
  6. You ask the questions.
  7. Live-tweet a breaking news event.
  8. Team up with the co-stars and use hashtags to drive fans to a TV show.
  9. Use Vine videos.
  10. Use a custom timeline to curate the best content.
  11. Run votes and display the results on air.

Website traffic, the learning curve

I’ve built plenty of websites over the years.  Some – from scratch, others – mere customizations and adaptations of someone else’s work.  But when it came to web promotion, I’ve usually handed it over to someone else.  Don’t get me wrong – I have a pretty good idea of how these things work, but I haven’t kept up and I haven’t practiced in a long while.

Currently, I am involved in a project where the web promotion bit is my responsibility, at least until the project grows and earns enough to hire a professional.  So I’m using it as a platform to refresh my knowledge, catch up with current trends, tools, and techniques, and to try out a few ideas of my own.  It is an interesting experience.

One thing I like is that the website is brand new, on a very young domain with no previous history.  The A/B tests and statistics cuts are very clean.  There is an opportunity to measure the effects of this or that campaign with a lot of precision and no interference from other traffic sources.

A lot has changed since I last did this.  One thing that amazes me is how dirt cheap web traffic is these days.  When I first went in to buy some, I had a price in my head.  I paid less and got more than I expected.  Then I studied it for a few days and got a much better price.  Then I tried something else and got an even better price.  I’m sure I’m not at the end of the tunnel yet either.

Of course, this is random, untargeted, pretty much non-converting traffic.  But it does have its pros this early in the game, and given the price, it’s well worth it.  Even so, I’ve got more conversions than I hoped for.

Let me mention it once again – I am pretty much a newbie in practical terms here.  If you have any advice or any resources that you think might help me out, please share and let me know.  Once I get a better handle on it, I’ll share my thoughts and experiences too.  Right now, though, it’s too embarrassing to do so.

Content authorship is the new cool

Here is a quote directly from Google’s Inside Search blog:

We now support markup that enables websites to publicly link within their site from content to author pages. For example, if an author at The New York Times has written dozens of articles, using this markup, the webmaster can connect these articles with a New York Times author page. An author page describes and identifies the author, and can include things like the author’s bio, photo, articles and other links.

If you run a website with authored content, you’ll want to learn about authorship markup in our help center. The markup uses existing standards such as HTML5 (rel=”author”) and XFN (rel=”me”) to enable search engines and other web services to identify works by the same author across the web. If you’re already doing structured data markup using microdata from schema.org, we’ll interpret that authorship information as well.

[…]

We know that great content comes from great authors, and we’re looking closely at ways this markup could help us highlight authors and rank search results.

In simple terms, this means that you should make sure that all your content – no matter where it is published – identifies you as the author.  This will help link all your content together, build your author profile, and let search engines use that as yet another criterion for ranking and search.  Those of you publishing with WordPress shouldn’t worry at all – adding authorship is either already done or will take a minor modification to the theme.  WordPress has provided both author pages and XFN markup out of the box for years.
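
If you want to double-check what your theme actually outputs, here is a minimal sketch – plain Python, standard library only – that scans a page for rel="author" and rel="me" links.  The URL is a placeholder for one of your own posts.

```python
# Minimal sketch: list rel="author" and rel="me" links on a page.
# The URL below is a placeholder, not a real post.
from html.parser import HTMLParser
from urllib.request import urlopen


class RelLinkParser(HTMLParser):
    """Collects <a> and <link> tags whose rel contains "author" or "me"."""

    def __init__(self):
        super().__init__()
        self.rel_links = []  # list of (rel, href) pairs

    def handle_starttag(self, tag, attrs):
        if tag not in ("a", "link"):
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower().split()
        if "author" in rel or "me" in rel:
            self.rel_links.append((" ".join(rel), attrs.get("href")))


url = "https://example.com/some-post/"  # placeholder post URL
parser = RelLinkParser()
with urlopen(url) as response:
    parser.feed(response.read().decode("utf-8", errors="replace"))

for rel, href in parser.rel_links:
    print(f'rel="{rel}" -> {href}')
```

If the script prints a rel="author" link pointing to your author page, the markup Google describes above is already in place.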

Pinging back your own posts

The Weblog Tools Collection blog is asking whether pinging your own previously published posts is a good idea.  Their concern is the updated Google ranking algorithm, which decreases the rating of sites with ‘nofollow’ links – exactly the kind of links that pingbacks create.

In my understanding, whenever you have a choice between improved search engine optimization (SEO) and improved user experience, always go for the user experience.  Pingbacks provide a valuable navigation path to updated content.  They are a way for the author to say that there is a development to the story, or that closely related content is available elsewhere on the site.

If Google, or any other search engine, penalizes such behaviour, I am more than sure that this is very temporary.  Even technically, making an exception for nofollow links within the same domain is a trivial change.
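
To illustrate what such an exception might look like, here is a minimal sketch of the decision logic – plain illustrative Python, not actual WordPress or Google code – that keeps same-domain pingback links followable and only marks external ones as nofollow.

```python
# Illustrative sketch only: decide which rel attribute a pingback link gets,
# making an exception for links that stay within the same domain.
from urllib.parse import urlparse


def rel_for_pingback(link_url: str, site_url: str) -> str:
    """Return the rel value a pingback link should get."""
    link_host = urlparse(link_url).hostname
    site_host = urlparse(site_url).hostname
    if link_host == site_host:
        return ""          # same domain: leave the link followable
    return "nofollow"      # external pingback: keep the usual nofollow


print(rel_for_pingback("https://example.com/older-post/", "https://example.com/"))       # ""
print(rel_for_pingback("https://other-blog.example.org/post/", "https://example.com/"))  # "nofollow"
```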

Getting rid of pingbacks to your own posts can and will seriously harm visitors’ navigation of the site.  And that would decrease your page views, incoming links, and everything else related to human activity.  Is the pingback question still unanswered for you?

Google does not use keywords meta tag

I’ve been explaining this to way too many people over the last few years – Google does not use the keywords meta tag for ranking search results.  That means you can drop it entirely, leave it empty, or fill it with whatever you want.  It just doesn’t matter.

And the reason for that is really simple: spam and search ranking manipulation.  It started even before Google was around, back when AltaVista and Yahoo were fighting for the title of the best search engine.  AltaVista relied quite heavily on the keywords meta tag.  And, as a result, you could often see sites which had nothing to do with the search query still rank at the top.  If I remember correctly, even Google paid some attention to that meta information at the beginning, but it soon became obvious that it could not be trusted.

Now, if you don’t believe me on this subject, either watch the video or read the explanation directly at the Google Webmaster Blog.  Satisfied?  Now stop spending hours upon hours of everyone’s time trying to develop the perfect tool and pick the perfect keywords.  It just doesn’t matter.

Forget the URL

Just yesterday I was talking with a few people about the Web and how things are changing, and the subject of URLs came up.  And I shared my opinion on the matter – URLs don’t matter.  Before you start arguing, I want to make it clear that I do understand that there are exceptions to every rule and no assumption stays true forever.  But on the other hand, that’s what I believe.

I think every Web surfer answers this question for himself at a certain point in time: do all these URLs that I have visited, searched for, and bookmarked over the years matter?  And I suspect that quite a few of those people will answer as I do – “no”.  I had a sneaking suspicion about this for some time, but it was the del.icio.us social bookmarking service that made me confident about it.

The thing with del.icio.us is that initially it was available at http://del.icio.us .  And as you might guess, there were quite a few problems with this URL.  Firstly, the word “delicious” is not in the active vocabulary of most non-native English speakers.  Secondly, even those who know the word have trouble spelling it correctly.  Thirdly, those who know how to spell it never seem to guess where to put the dots.  And fourthly, the logical link between the meaning of the word “delicious” and a social bookmarking web site is vague at best.

However, that didn’t stop del.icio.us from becoming the most popular social bookmarking web site on the Web.  And that was when I became confident that URLs don’t matter any more.  Very few people will remember them.  Most people will find the site with the help of a search engine.  And those who are really interested in getting back to it will bookmark it.  It’s that simple.

What about brand names, you ask?  Brand names are important.  But you can avoid linking brand names to URLs.  What about people’s names?  Only a few will remember them.  What about original, non-standard domain names?  Only a few of those will be remembered; the rest will be searched for and bookmarked.  Why do I have the mamchenkov.net domain name then?  Because it was available and because it links to my name nicely.  If it wasn’t available, I’d use something else.  And, in fact, I did use a couple of other domains before I registered mamchenkov.net .

Why did I suddenly start talking about this?  Because today I came across someone else saying practically the same thing – the “Do URLs matter anymore?” article over at CNET News, and I quote:

People still try to trade the most simple URLs for hopeful hundreds of thousands. They will still line up in the hope of getting a vanity URL from Facebook.  But don’t most people simply go to the little search box, type in the name of what they’re looking for, and search?  If it’s something they want to go back to, they’ll bookmark it. But they won’t remember what the URL is. For the simple reason that they don’t need to.

Can you handle the popularity?

Looking around the blogosphere, I see more and more bloggers who work really hard on promoting their sites.  They optimize their themes for Google, submit their blogs to all sorts of directories, share links to their best content via social networks, microblog, and comment all around the web.
Well, that’s all fine.  But here is the question – can they handle the popularity?

I had been thinking about this before, but it all came to me suddenly yesterday and today.  One of my recent posts got submitted to reddit.com, and somehow it went through to the main page of the site, and from there got aggregated via RSS to a lot of other places.  Within 24 hours, my blog received more than 20,000 views.  Compared to an average day, which brings well under a thousand, that’s a lot.

This sounds like a dream come true for any blogger, no?  Well, it is, sort of.  But.  Consider the other side of the story, which is not so obvious at first glance:

  • My hosting company handled the spike really well – no complaints or disconnections.  Not all hosting companies are created equal.
  • The commenting form on my blog was broken at the time of the spike, and it stayed down for the spike’s whole duration.
  • There were about 500 comments posted in the reddit.com thread.
  • I’ve received almost 100 emails.
  • When the commenting form got fixed, I got another dozen or so comments, plus another wave of spam along with them.

If you imagine for a moment all of that coming upon you in the middle of the working week, you’ll see a problem.  Who should respond to all that, and how?

I spent half a day today talking with my hosting company about the commenting form.  Thankfully, it got fixed (the problem was a session misconfiguration on the hosting company’s side).  Then I needed some time to respond to all those emails.  In the meantime, I quickly reviewed and approved all comments in the moderation queue.  That pretty much ate my day, together with the few work tasks I managed to slide in.

Later in the evening, when my family went to sleep, I actually read all the comments and responded to a few.  I also read through most of the comments at reddit.com .  Can I reply to any of those?  Nope.  That’s beyond my resources.  I can’t handle all the traffic that came in.

Can you?  What will happen to your server if you get dugg or slashdotted?  How will you moderate all the comments?  How will you handle replies?  What about comments at other places – blogs, forums, and social networks – that brought you the traffic?  Do you have any moderators on standby?  Do you have any monitoring set up (Google Alerts, coComment, etc.) for remote discussions about your content?
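
As a starting point, here is a minimal sketch of a poor man’s availability check you could run during a spike – plain Python with placeholder URLs for the front page and the commenting endpoint – so you at least notice when something critical stops responding.

```python
# Minimal sketch: poll a few critical URLs and report status and response time.
# Both URLs are placeholders; substitute your own front page and comment endpoint.
import time
from urllib.error import HTTPError, URLError
from urllib.request import urlopen

URLS = [
    "https://example.com/",                      # front page (placeholder)
    "https://example.com/wp-comments-post.php",  # comment endpoint (placeholder)
]

for url in URLS:
    start = time.time()
    try:
        with urlopen(url, timeout=10) as response:
            status = response.status
    except HTTPError as error:
        status = error.code                       # server answered with an error code
    except URLError as error:
        status = f"unreachable ({error.reason})"  # no answer at all
    elapsed = time.time() - start
    print(f"{url} -> {status} in {elapsed:.2f}s")
```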

If you aren’t thinking about those things while promoting your blog, you are in for a big surprise…

Tip for web promoters

If you care about web promotion of your web site, if you post articles titled “10 steps to do XYZ” or “ABC in 3 minutes”, if you want your blog posts to be bookmarked across all social networks, if you follow your incoming links with more attention than your personal hygiene, then here is a tip for you.

Look at the limitations that social bookmarking services impose on their users.  Make sure that you provide a quick way to bookmark your site, with sufficient information that fits within those limitations.

Take del.icio.us, for example.  What limitations does it impose on its users?  There are a few, but the main one is the length of the description.  Whenever I bookmark your web site, I can only post 255 characters of description.  That is both too much and too little.

It is too much if I have to type my own description.  I don’t have the time to describe all the web sites that I bookmark.  For many of them, I don’t even have any idea what to write, since I bookmark the web site to check it out later… So whenever I bookmark a web site, I look around for a quick way to generate that description.  And the easiest and fastest way is always a copy-paste.

That’s where the description length limit becomes too small.  Most web sites have an “About” page these days, but it’s too long for a description.  A couple of paragraphs would do, and I can almost always find those paragraphs to copy-paste, but they almost never fit into 255 characters.  That’s where you come in.

First of all, make sure that there is a piece of text, less than 255 characters long, that gives me an idea of what the article, post, page, or web site is about.  Secondly, make sure that I can find that piece of text easily.  Make it bold.  Put a border around it.  Slap a “Synopsis” or “About” or “In brief” label somewhere nearby.  You can even go as noisy as “del.icio.us users might want to use this as a description: …”.
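
If you want to be sure your synopsis actually fits, here is a minimal sketch – plain Python, with a made-up example text – that checks a piece of text against the 255-character limit.

```python
# Minimal sketch: check a synopsis against the 255-character description limit.
DESCRIPTION_LIMIT = 255


def check_synopsis(text: str, limit: int = DESCRIPTION_LIMIT) -> None:
    """Report whether the text fits the limit and by how much it misses."""
    length = len(text.strip())
    if length <= limit:
        print(f"OK: {length} characters, fits within {limit}.")
    else:
        print(f"Too long: {length} characters, trim at least {length - limit}.")


# Made-up example synopsis, not from any real site.
synopsis = (
    "A hands-on look at web promotion for a brand-new site: cheap traffic, "
    "clean A/B tests, and what actually converts."
)
check_synopsis(synopsis)
```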

Why would you want to go to all that trouble?  Because it will help me, your visitor, keep my bookmarks organized and annotated.  I will be able to find this bookmark much faster later on.  And that means that the chances of me coming back, sending this link to someone else, or blogging about it are much higher.  And that is what you, as a web promoter, want.  Isn’t it?