My never ending WP 2.6 Thread...

Re: My never ending WP 2.6 Thread…

Could be a banner rotation plugin or something like that, or simply an entry with nothing else in it and the title, etc. removed…

Re: My never ending WP 2.6 Thread…

[QUOTE=abostonboy;18394]Awesome! Thank you. Next question -

How do you get an ad like this
http://youlovegayporn.com/ (third post)

and like this -
http://queerclick.com/

(second post)

Is that a plug in?[/QUOTE]

Could be a plugin. I don’t know of one for 2.6 yet though.

Not sure if that one blog is WP or not. It’s simple to do in BO.

Or they edited the index.php with some code to simply insert an ad or whatever after every x posts.
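Something like this would do it - just a rough sketch assuming a standard theme loop in index.php; the counter, the every-third-post interval and the ad markup are all placeholders:

<?php if (have_posts()) : $ad_count = 0; ?>
<?php while (have_posts()) : the_post(); $ad_count++; ?>

    <!-- your theme's normal post markup -->
    <h2><a href="<?php the_permalink(); ?>"><?php the_title(); ?></a></h2>
    <?php the_content(); ?>

    <?php if ($ad_count % 3 == 0) : ?>
    <!-- ad block inserted after every third post -->
    <div class="inline-ad">sponsor banner code goes here</div>
    <?php endif; ?>

<?php endwhile; ?>
<?php endif; ?>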

Jimmy

Re: My never ending WP 2.6 Thread…

[QUOTE=abostonboy;18394]Awesome! Thank you. Next question -

How do you get an ad like this
http://youlovegayporn.com/ (third post)

and like this -
http://queerclick.com/

(second post)

Is that a plug in?[/QUOTE]

This should do the trick http://www.maxblogpress.com/plugins/mba/mba-use/
Works with 2.6

Re: My never ending WP 2.6 Thread…

I’d strongly recommend putting it in your theme template. Plugins can only slow things down and break.

Put it just after <body> if you intend to do event tagging, or just before </body> if you don’t intend to do event tagging.
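In a WordPress theme that usually means header.php and footer.php. A sketch, assuming a standard theme where header.php outputs the opening <body> tag and footer.php outputs the closing one (the tracking snippet itself is whatever your stats provider gives you):

<!-- header.php - right after <body>, if you do want event tagging -->
<body>
<!-- tracking snippet goes here -->

<!-- footer.php - just before </body>, if you don't -->
<?php wp_footer(); ?>
<!-- tracking snippet goes here -->
</body>
</html>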

Re: My never ending WP 2.6 Thread…

OK, here is another one. How do I use nofollow?

Re: My never ending WP 2.6 Thread…

Go here: http://codex.wordpress.org/Plugins/Plugin_Compatibility/2.6 and scroll down to the M-P listings; there are several nofollow plugins. Just read up on them and choose the one that does what you want.

Re: My never ending WP 2.6 Thread…

The quick answer is to do the following:

<a href="http://www.gaydemon.biz/" rel="nofollow">Great Site</a>

Or if you want you can put a meta tag on the page so all links are nofollow by default. That would look like:

<meta name="robots" content="nofollow" />

Then if you wanted to undo the default nofollow you would do the following:

<a href="http://www.gaydemon.biz/" rel="follow">Great Site</a>

The thing is, WordPress doesn’t make it easy. There are plugins that can help, but I always just drop down into the code and edit the link.

I’ve always been a big fan of nofollow, but I’m starting to change my mind. The issue is that regular people don’t use nofollow - just people who are SEO-aware. So there’s an argument to be made that using nofollow can hurt you in that it identifies you as someone who might be trying to mess with the system. It’s better to look like an average Joe who just happens to have perfect title tags and the like.

That said, you should not let search engine spiders follow paid links - better yet, they shouldn’t even know those links exist. The way to handle this is to put all outbound links through a redirect script that you’ve blocked with robots.txt. Most of the major sites do this.

I’m just now going through the pain of implementing one and retrofitting hundreds of blog posts with the new links. It’s a pain in the ass, but it gives you a lot of control, since you can update all the sponsor links of a particular type with one change. This weekend, for example, I went from PPS to revenue share and back to PPS on all of my links for a particular sponsor just by changing the link codes in one place (they had a special program running that I wanted to take advantage of).
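The robots.txt side of it is just a short block - a sketch assuming the redirect script sits in a /scripts/ directory (use whatever path you actually put it at):

# keep spiders away from the outbound redirect script
User-agent: *
Disallow: /scripts/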

Re: My never ending WP 2.6 Thread…

"put all outbound links through a redirect script"

Such as? And is it free? :)

Re: My never ending WP 2.6 Thread…

RawTop,

Are you using OpenAds?

Re: My never ending WP 2.6 Thread…

I use Platinum SEO, which is similar to the All in One SEO plugin; it gives you control over adding nofollow, noindex, or a combination of the two for each post/page.

There is also the Robots Meta plugin, which will give you the same options on individual posts.

Re: My never ending WP 2.6 Thread…

I’m a programmer who likes writing custom code. Since I needed something and wasn’t really comfortable with what I saw out there, I just wrote a simple PHP script that does 302 redirects. Here’s a snippet from it…

<?php
// Map each 'lnk' ID to the matching sponsor tracking URL, send a 302 redirect, then stop.
if ($_GET['lnk'] == 1){ /* Bareback That Hole */
    header("Location: http://www.barebackthathole.com/visitor1.php?idWebMaster=1196801495&idBanner=579&SpecialProgram=0&type=linktext",TRUE,302); exit;
}
elseif ($_GET['lnk'] == 2){ /* Felch That Hole */
    header("Location: http://www.felchthathole.com/visitor1.php?idWebMaster=1196801495&idBanner=814&SpecialProgram=0&type=linktext",TRUE,302); exit;
}
elseif ($_GET['lnk'] == 3){ /* Pop Boys */
    header("Location: http://in.popboys.com/track/MTk4Nzo0Mzoy/",TRUE,302); exit;
}
?>

So it’s pretty basic and I just call it something like this…

http://www.rawtop.com/scripts/outbound.php?lnk=3

It’s quick and dirty and does the job. In the future I can add database logging and the like if I want. I can already tell I want to create a database of all the links and have it generate and upload the file for me. But that’s part of a bigger project I want to work on - sort of my own personal Blogs Automater with a feature set that’s better suited to how I want to do things, including image retrieval from sponsors, image cataloging, and a streamlined process for creating collages and gallery pages…
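If anyone wants to go the database route, here is a very rough sketch of what the generation step could look like - the outbound_links table, its columns, and the PDO connection details are all hypothetical, not what I’m actually running:

<?php
// Sketch: regenerate outbound.php from a hypothetical outbound_links table (id, label, target_url).
$db = new PDO('mysql:host=localhost;dbname=blog', 'user', 'password');

$code = "<?php\n";
foreach ($db->query('SELECT id, label, target_url FROM outbound_links ORDER BY id') as $row) {
    $code .= "if (\$_GET['lnk'] == " . (int)$row['id'] . ") { /* " . $row['label'] . " */\n";
    $code .= "    header(\"Location: " . $row['target_url'] . "\", TRUE, 302); exit;\n";
    $code .= "}\n";
}
$code .= "?>\n";

// Write the generated script wherever the blog serves it from.
file_put_contents('outbound.php', $code);
?>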

Re: My never ending WP 2.6 Thread…

rawTOP, thanks for this very helpful advice. I’m going to implement a similar script on my blogs and update all my previous blog posts with it.

Re: My never ending WP 2.6 Thread…

I just had a brainstorm on the redirect script I mentioned above… I’m in the process of automating quite a bit of what I do (at least the parts that can be automated), so I’ve got a database that generates that file for me. Then it dawned on me… Instead of blocking that file with robots.txt, I’ll let bots crawl it but redirect the bots to category pages on my site appropriate to the outbound link. So a link that would send a user to hotbarebacking.com would redirect the bot to a page on my site about hotbarebacking.com… I’ll just modify the script to check the user agent or do a reverse IP lookup and work from there…
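For the reverse IP part, the usual trick is a forward-confirmed reverse DNS check, since the user agent string alone is easy to fake. A minimal sketch (the function name is mine, not something from the script above):

<?php
// Sketch: confirm a visitor claiming to be Googlebot really is, via reverse then forward DNS.
function is_real_googlebot($ip) {
    $host = gethostbyaddr($ip);                          // e.g. crawl-66-249-66-1.googlebot.com
    if (!preg_match('/\.(googlebot|google)\.com$/i', $host)) { return FALSE; }
    return gethostbyname($host) === $ip;                  // forward lookup must point back at the same IP
}

$spider = is_real_googlebot($_SERVER['REMOTE_ADDR']);
?>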

Re: My never ending WP 2.6 Thread…

In case anyone’s interested… Here’s a snippet from my modified routine that serves one page to users and another to spiders… When there isn’t a suitable equivalent I just send the spider to the home page of the porn blog with a 302 redirect (instead of 301)…

<?php
// Flag the request as a search engine spider based on the user agent string.
// stripos() returns the match position (which can be 0), so test against FALSE rather than > 0.
if (stripos($_SERVER['HTTP_USER_AGENT'], 'googlebot') !== FALSE) { $spider = TRUE; }
elseif (stripos($_SERVER['HTTP_USER_AGENT'], 'msnbot') !== FALSE) { $spider = TRUE; }
elseif (stripos($_SERVER['HTTP_USER_AGENT'], 'teoma') !== FALSE) { $spider = TRUE; }
elseif (stripos($_SERVER['HTTP_USER_AGENT'], 'slurp') !== FALSE) { $spider = TRUE; }
else { $spider = FALSE; }
// Dummy first branch so every real case below can be a uniform elseif.
if (FALSE) { }
/* GayDemon Dicktionary link trade */
elseif (($_GET['lnk'] == 26) && ($spider)) { header("Location: http://www.rawtop.com/porn/",TRUE,302); }
elseif ($_GET['lnk'] == 26) { header("Location: http://www.gaydemon.com/dicktionary/".$_GET['t']."/",TRUE,302); }
/* Gunz Blazing */
elseif (($_GET['lnk'] == 44) && ($spider)) { header("Location: http://www.rawtop.com/porn/",TRUE,302); }
elseif ($_GET['lnk'] == 44) { header("Location: http://gunzblazing.com/ref.php?w=101203",TRUE,302); }
/* Gunz Blazing - hotbarebacking.com */
elseif (($_GET['lnk'] == 38) && ($spider)) { header("Location: http://www.rawtop.com/porn/labels/hot-barebacking.html",TRUE,301); }
elseif ($_GET['lnk'] == 38) { header("Location: http://gunzblazing.com/hit.php?w=101203&s=16&p=2&c=579&t=0&cs=",TRUE,302); }
?>

The GayDemon Dicktionary link case is how link trades will work (yes Bjorn, I am still working on that). That’s followed by a program referral that doesn’t have a suitable internal link and sends spiders to the home page of the porn blog. Then that’s followed by a site referral.

Re: My never ending WP 2.6 Thread…

[quote=rawTOP;18920]In case anyone’s interested… Here’s a snippet from my modified routine that serves one page to users and another to spiders… When there isn’t a suitable equivalent I just send the spider to the home page of the porn blog with a 302 redirect (instead of 301)…

<?php
// Flag the request as a search engine spider based on the user agent string.
// stripos() returns the match position (which can be 0), so test against FALSE rather than > 0.
if (stripos($_SERVER['HTTP_USER_AGENT'], 'googlebot') !== FALSE) { $spider = TRUE; }
elseif (stripos($_SERVER['HTTP_USER_AGENT'], 'msnbot') !== FALSE) { $spider = TRUE; }
elseif (stripos($_SERVER['HTTP_USER_AGENT'], 'teoma') !== FALSE) { $spider = TRUE; }
elseif (stripos($_SERVER['HTTP_USER_AGENT'], 'slurp') !== FALSE) { $spider = TRUE; }
else { $spider = FALSE; }
// Dummy first branch so every real case below can be a uniform elseif.
if (FALSE) { }
/* GayDemon Dicktionary link trade */
elseif (($_GET['lnk'] == 26) && ($spider)) { header("Location: http://www.rawtop.com/porn/",TRUE,302); }
elseif ($_GET['lnk'] == 26) { header("Location: http://www.gaydemon.com/dicktionary/".$_GET['t']."/",TRUE,302); }
/* Gunz Blazing */
elseif (($_GET['lnk'] == 44) && ($spider)) { header("Location: http://www.rawtop.com/porn/",TRUE,302); }
elseif ($_GET['lnk'] == 44) { header("Location: http://gunzblazing.com/ref.php?w=101203",TRUE,302); }
/* Gunz Blazing - hotbarebacking.com */
elseif (($_GET['lnk'] == 38) && ($spider)) { header("Location: http://www.rawtop.com/porn/labels/hot-barebacking.html",TRUE,301); }
elseif ($_GET['lnk'] == 38) { header("Location: http://gunzblazing.com/hit.php?w=101203&s=16&p=2&c=579&t=0&cs=",TRUE,302); }
?>

The GayDemon Dicktionary link case is how link trades will work (yes Bjorn, I am still working on that). That’s followed by a program referral that doesn’t have a suitable internal link and sends spiders to the home page of the porn blog. Then that’s followed by a site referral.[/quote]

Sounds interesting, but what about the rules on cloaking - serving a different page to spiders than to surfers? This sounds an awful lot like cloaking.

From Google’s Terms

Cloaking: Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.
Some examples of cloaking include:

  • Serving a page of HTML text to search engines, while showing a page of images or Flash to users.
  • [U][B]Serving different content to search engines than to users.[/B][/U]
If your site contains elements that aren't crawlable by search engines (such as rich media files other than Flash, JavaScript, or images), you shouldn't provide cloaked content to search engines. Rather, you should consider visitors to your site who are unable to view these elements as well.

Re: My never ending WP 2.6 Thread…

Is keeping them away from something they have said they don’t want to see “cloaking”? Not really…

“Cloaking” is allowed to reduce duplicate content because duplicate content messes them up. Hence when you have links that do things like changing the number of products shown in a product list, you’re allowed to “cloak” those URLs to reduce the amount of duplicate content googlebot has to deal with.

Google has said they do not want to see “paid links” - that if they exist, they should be “nofollowed”. What I’m doing is keeping them away from something they’ve said they don’t want to see and redirecting them to pages on the same exact topic. I’m only using a 301 when the topic is identical. So, “Join HotBarebacking.com” would go to the page on my site that is all about HotBarebacking.com.

True, it’s not pure white hat, but it’s not black hat either… It’s more light gray hat…

[I should mention that Google doesn’t like to admit that their definition of cloaking covers things they allow - like duplicate content reduction. But it does. To them the term cloaking is only bad - there are no good forms of cloaking… But the real world is different. Rand Fishkin had a good post on it a couple weeks ago and it brought the whole issue up and Matt Cutts was in the discussion - pretty interesting…]

Re: My never ending WP 2.6 Thread…

[quote=rawTOP;18923]Is keeping them away from something they have said they don’t want to see “cloaking”? Not really…

“Cloaking” is allowed to reduce duplicate content because duplicate content messes them up. Hence when you have links that do things like changing the number of products shown in a product list, you’re allowed to “cloak” those URLs to reduce the amount of duplicate content googlebot has to deal with.

Google has said they do not want to see “paid links” - that if they exist, they should be “nofollowed”. What I’m doing is keeping them away from something they’ve said they don’t want to see and redirecting them to pages on the same exact topic. I’m only using a 301 when the topic is identical. So, “Join HotBarebacking.com” would go to the page on my site that is all about HotBarebacking.com.

True, it’s not pure white hat, but it’s not black hat either… It’s more light gray hat…

[I should mention that Google doesn’t like to admit that their definition of cloaking covers things they allow - like duplicate content reduction. But it does. To them the term cloaking is only bad - there are no good forms of cloaking… But the real world is different. Rand Fishkin had a good post on it a couple weeks ago and it brought the whole issue up and Matt Cutts was in the discussion - pretty interesting…][/quote]

True, the spirit of it is keeping away things they don’t want to see, but I just don’t feel it’s a comfortable solution to give them one page while giving the surfer a different one. It comes awfully close to the second definition of cloaking - close enough to be scary and to warrant caution.

If the ‘nofollow’ and ‘robots’ work, why serve them a different page?

And if you are redirecting them to a more authoritative page, or one that is relevant to the link, that alone would make me wary of the site from a Google perspective - as if you are trying to garner a better SE position by hiding the true link, paid or not. It seems a bit more than light grey, IMHO, even though the ‘intent’ isn’t, as you say - but G isn’t going to know that.

Re: My never ending WP 2.6 Thread…

[QUOTE=Gaystoryman;18926]True, the spirit of it is keeping away things they don’t want to see, but I just don’t feel it’s a comfortable solution to give them one page while giving the surfer a different one. It comes awfully close to the second definition of cloaking - close enough to be scary and to warrant caution.

If the ‘nofollow’ and ‘robots’ work, why serve them a different page?

And if you are redirecting them to a more authoritative page, or one that is relevant to the link, that alone would make me wary of the site from a Google perspective - as if you are trying to garner a better SE position by hiding the true link, paid or not. It seems a bit more than light grey, IMHO, even though the ‘intent’ isn’t, as you say - but G isn’t going to know that.[/QUOTE]

I wouldn’t call the pay site an authoritative page… lol

Nofollow tells them too much about your site. If they know the primary purpose is affiliate links, there’s a good chance they’ll think less of your site. It also tells them you’re SEO-aware, so they’ll start looking for ways in which you’re gaming the system (though the same could be said for a redirect script).

Robots.txt is perhaps the most white hat way to go, but then you have a lot of link juice accumulating in blocked URLs…

It’s a hard thing to argue… I’d say I’m revealing to them the nature of the link while keeping them away from affiliate pages they don’t want to see. But I get the other side of the argument as well…

The question is what to do with link exchanges like the one for GD Dicktionary… They say they don’t like link exchange networks either. I’m tempted to do a 302 on those and let them follow it…

Re: My never ending WP 2.6 Thread…

I thought you were redirecting the spiders to pages on your own site, not to a paysite :rolleyes:

"I’ll let bots crawl it but redirect the bots to category pages on my site appropriate to the outbound link"

Yes, the recip links from the GD Dicktionary could be difficult, but if the pages linking together are on the same topic, it shouldn’t really be classed as a link farm or link scheme. It’s sites with similar content linking to each other. While G does seem to devalue reciprocal links, I don’t see it being a huge problem with the Dicktionary at GD, unless the links are on some links page or plastered all over irrelevant pages.

Re: My never ending WP 2.6 Thread…

I am. What made you think I wasn’t? Let me rephrase… Let’s say I have a link saying “Join Felch That Hole Now!” - the users would go to the pay site, the spiders would go to the page on my site that has my posts about FelchThatHole.com.

Actually, GD uses robots.txt to block spiders from outbound links. So there isn’t a reciprocal link issue… But there might be issues with reciprocal links to other sites that aren’t as well managed… Since GD blocks spiders from seeing the link to my site, I don’t feel obligated to return a spiderable link. That said, a 302 to GD is more truthful than a 301 to a page on my site and won’t pass link juice, so I might do that too…

Maybe what got you confused is that, on a link-by-link basis, I can link out or link in using either a 301 or a 302…