On traffic bots

Linden Research Inc. has finally decided to do something about traffic bots… and even though I very much dislike traffic bots, I don’t like the decision at all. I will, as usual, make a prediction and put it on file here for when things go wrong. As much as I hate to say ‘I told you so’, that is one of my special knacks, after all.

Let us formulate the original problem.

  1. Traffic score is a component of the parcel search ranking.
  2. It is in the shop owner’s interest to rank higher in search, because a higher ranking should, in theory, drive more people with intent to spend money (and without a clear idea what they want to spend it on) to their shop.
  3. Artificially inflating traffic makes the search system largely unusable unless you’re looking for a known brand name.
  4. It is therefore desirable that artificial inflation of traffic no longer happen.

The possible solutions to this problem fall into two groups: restricting the use of bots in one fashion or another, or restricting the effects of traffic score in one fashion or another. The two extreme solutions would be to ban bots outright and to remove traffic calculation completely.

Banning bots outright is obviously unacceptable for multiple reasons. Bots are the only answer to the deficiencies of Second Life technology and have numerous industrial uses: without bots, importing objects from external software would be such a pain that it would never be practical, and bots are the only way to perform numerous things that a script is not allowed to do but an avatar is. In general, using a bot just to inflate traffic is like using a stack of microscopes as a counterweight for a trebuchet. Moreover, it is impossible to readily distinguish a bot from a customized client; it is, in fact, possible to hack up the standard client so that it remains indistinguishable from the server side yet can be programmed to perform actions, which would make it a bot by definition.

Removing traffic completely is not unacceptable at all — after all, Google ranks pages without any regard for how often they are read, and it somehow works. It uses other signals, notably the number of times a page is linked to, but that is by no means equivalent to avatar dwell traffic.

Yet they chose a middle ground, and that middle ground does not solve the problem, imposes extra work on Lindens (don’t they have enough to do?) and creates extra problems. How?

  1. The announcement says that Lindens will be monitoring prospective bot farmers themselves. Extra work for Lindens.
  2. Other uses of bots, namely shop mannequins, are explicitly permitted by the new policy and cannot be readily distinguished from traffic botting.
  3. Before traffic bots there was camping, which this policy does not prohibit. Bots that are paid ridiculously low camping rates are not readily distinguished from campers either.
  4. The incentive to inflate traffic is still there, and ways to do it will still be found, since inflating traffic has not been made ‘physically’ impossible.

Campers are, in general, a worse drain on sim resources than bots: being better dressed and out in the open, they require numerous scripts to handle them and hundreds of small transactions to pay them. Every underhanded technique to get people to camp that was previously not cost-effective compared to bots will be pulled out of the closet, and we can expect much worse atrocities and grievances — for example, I don’t think a camper can (or will) file an Abuse Report when they haven’t been paid for camping, as Linden Research Inc. will not enforce contracts between residents at all. The very first thing shop owners will do, since it doesn’t require much effort, is dress up their bots as mannequins and put them in full view of the shoppers. The shops that use mannequins legitimately will suffer too (Edelweiss, with its L$600 maid dresses, does have a good reason to employ mannequins: that is the best way to show that an outfit is worth a price that is quite high even by Japanese standards).

Meanwhile, there are other intermediate solutions that do not involve such potential for abuse or so much manual work to enforce. There definitely is a way to design a traffic calculation that takes so much research to game with bots that gaming it becomes impractical to attempt. For example…

  1. Traffic awarded to the parcel by the avatar is equal to 0 until the time spent on the parcel reaches a value X.
  2. Traffic awarded to the parcel by the avatar becomes negative if the avatar spends more than Y minutes on it.
  3. Traffic is calculated not daily but weekly.
  4. If an avatar visited the parcel multiple times during the accounting period, only Z of their visits are counted, and which Z visits are picked and how is kept secret.

This works more or less like this: assuming that you possess a bot, if you let it hang around the parcel forever, your traffic will go down. If you try to find the value of Y so that your bots can flicker in and out of the parcel, it will take quite a few weeks of experiments even if you have a good guess of what Y is. Just flickering bots in and out and hoping for the best is as likely to destroy your traffic score as it is to increase it. If the values of X, Y and Z are not constant, but depend on something that changes over time, multiple times during the accounting period — for example, the current online count at the moment the avatar entered the parcel — discovering them and the way they change will require many months of expensive, rigorous experimentation. And if someone does discover it, you can tweak the formula and leave them in the dirt.
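To make the idea concrete, here is a minimal sketch of such a calculation in Python. The constants standing in for X, Y and Z, the function names and the sample data are all my own illustrative inventions, not anything Linden Research has proposed; in a real implementation the constants would also drift over the accounting period rather than sit in the source as literals.

```python
import random
from collections import defaultdict

# Illustrative constants only; in a real system they would vary over time.
X_MIN_MINUTES = 5      # visits shorter than X award nothing
Y_MAX_MINUTES = 120    # visits longer than Y count against the parcel
Z_VISITS_COUNTED = 3   # at most Z visits per avatar are counted per week

def visit_score(minutes):
    """Score one visit: zero below X, negative above Y, dwell time otherwise."""
    if minutes < X_MIN_MINUTES:
        return 0.0
    if minutes > Y_MAX_MINUTES:
        return -float(minutes)   # overstaying drags the score down
    return float(minutes)

def weekly_traffic(visits):
    """visits is a list of (avatar_id, minutes_on_parcel) for one accounting week."""
    by_avatar = defaultdict(list)
    for avatar, minutes in visits:
        by_avatar[avatar].append(minutes)

    total = 0.0
    for durations in by_avatar.values():
        # Only Z visits per avatar count, and which ones is deliberately unpredictable.
        sampled = random.sample(durations, min(Z_VISITS_COUNTED, len(durations)))
        total += sum(visit_score(m) for m in sampled)
    return max(total, 0.0)

print(weekly_traffic([
    ("bot_one", 3), ("bot_one", 4),      # flickering in and out: scores nothing
    ("bot_two", 600),                    # parked all week: scores negative
    ("shopper", 20), ("shopper", 35),    # genuine browsing: scores positively
]))
```

A bot owner probing this from the outside sees only the weekly total, so separating the effects of X, Y and Z (let alone their drift over time) from the noise of genuine visitors is exactly the expensive experimentation described above.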

Sure, it’s more complex to code, but code is written once, while manual enforcement takes manpower forever. While there would still be an incentive to have high traffic, gaming it with bots or campers would become impractical, and parcel owners would just have to try making their parcels interesting and hope for the best.

Oh, the announcement also mentions land bots. Despite what some people think, the text actually implies that land bots will not be forbidden; they will merely stop working, because the ability to purchase a parcel inworld will be removed and parcels will only be bought through a web interface.

So the land bots will give way to web land bots, which can be far less resource-hungry, cheaper to run, and readily available even to those who could never find their way through the labyrinthine documentation of libsecondlife/libopenmetaverse.
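To see why that is cheap, consider what such a web land bot amounts to: a polling loop over a listings page instead of a full grid client. The sketch below assumes a purely hypothetical JSON endpoint and field names of my own invention; the point is that it needs nothing heavier than an HTTP library, with no connection to a sim at all.

```python
import json
import time
import urllib.request

# Hypothetical listings endpoint and bargain threshold, purely for illustration.
LISTINGS_URL = "https://land.example.com/api/parcels-for-sale"
MAX_PRICE_PER_SQM = 1.0   # L$ per square metre considered a bargain

def check_listings():
    # One small HTTP request instead of a logged-in avatar scanning regions.
    with urllib.request.urlopen(LISTINGS_URL, timeout=30) as response:
        parcels = json.load(response)   # assume a JSON array of parcel records
    for parcel in parcels:
        price_per_sqm = parcel["price"] / parcel["area"]
        if price_per_sqm <= MAX_PRICE_PER_SQM:
            # A real bot would buy or notify its owner here; we just print.
            print("Bargain:", parcel["name"], price_per_sqm)

while True:
    check_listings()
    time.sleep(60)   # one cheap request a minute, no sim resources consumed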

Way to go, gentlemen. Where’s the sarcasm tag when you need it?…


3 thoughts on “On traffic bots”

  1. “We will continue to strive toward providing more statistical data to land owners, including the number of visitors they receive. However, the way these statistics relate to Search ranking will be changing. In the next few months, we will be making both technical and policy changes to the way relevance and ranking works in Search. The “traffic” score will be only one aspect of the ranking logic, and it will be scrubbed and weighted to account for gaming vectors.”

    Not only will there be a clampdown on blatant overuse of bots, there will also be greater efforts to control gaming of the search system in general. LL uses the Google search box, and it has a lot of flexible options to handle search-result gaming. No complex code required. 😉

  2. I will believe it when I see it, along with the other mythical things they have promised for years, like llTeleportAgent. Meanwhile, have fun camping. 🙂

  3. I’ve doubted campers are actually campers for a long time. It was just a progression from real campers to bots camping. Saves a tonne of money and all those pesky transactions.
