Entries in games (13)

Saturday
Dec 20, 2008

Second Life Architecture - The Grid

Update: Presentation: Second Life’s Architecture. Ian Wilkes, VP of Systems Engineering, describes the architecture of the popular virtual world Second Life: how it looked at its debut and how it has evolved over the years as users and features were added. Second Life is a 3-D virtual world created by its Residents. Virtual worlds are expected to become more and more popular on the internet, so their architecture may be of interest, especially given the appearance of open virtual worlds, or metaverses. What happens when video games meet Web 2.0? What happens is the metaverse.

Information Sources

Platform

  • MySQL
  • Apache
  • Squid
  • Python
  • C++
  • Mono
  • Debian

What's Inside?

The Stats

  • ~1M active users
  • ~95M user hours per quarter
  • ~70K peak concurrent users (40% annual growth)
  • ~12Gbit/sec aggregate bandwidth (in 2007)

Staff (in 2006)

  • 70 FTE + 20 part time
"about 22 are programmers working on SL itself. At any one time probably 1/3 of the team is on infrastructure, 1/3 is on new features and 1/3 is on various maintenance tasks (bug fixes, general stability and speed improvements) or improvements to existing features. But it varies a lot."

Software

Client/Viewer
  • Open Source client
  • Renders the virtual world
  • Handles user interaction
  • Handles locations of objects
  • Gets velocities and does simple physics to keep track of what is moving where (see the sketch after this list)
  • No collision detection
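
To make the "simple physics" point concrete, here is a minimal dead-reckoning sketch in Python. It is illustrative only: the real viewer is C++, and the class and field names below are invented. The idea is just that the client extrapolates an object's position from its last known position and velocity until the next server update arrives.

    import time

    class TrackedObject(object):
        """Last known state received from the simulator (hypothetical viewer-side class)."""

        def __init__(self, position, velocity, timestamp):
            self.position = position    # (x, y, z) in meters
            self.velocity = velocity    # (vx, vy, vz) in meters/sec
            self.timestamp = timestamp  # when this update arrived

        def extrapolate(self, now=None):
            """Dead-reckon the current position from the last update.
            No collision detection here: the sim owns that; the viewer just
            keeps objects moving smoothly between updates."""
            now = time.time() if now is None else now
            dt = now - self.timestamp
            return tuple(p + v * dt for p, v in zip(self.position, self.velocity))

    # An object last reported at the origin, moving 2 m/s along x, half a second ago:
    ball = TrackedObject((0.0, 0.0, 0.0), (2.0, 0.0, 0.0), time.time() - 0.5)
    print(ball.extrapolate())  # roughly (1.0, 0.0, 0.0)
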
Simulator (Sim)
Each geographic area (a 256x256 meter region) in Second Life runs on a single instance of server software called a simulator, or "sim," and each sim runs on a separate core of a server. The simulator is the primary SL C++ server process and runs on most servers. As the viewer moves through the world, it is handed off from one simulator to another.
  • Runs Havok 4 physics engine
  • Runs at 45 frames/sec. If it can't keep up, it will attempt time dilation without reducing the frame rate.
  • Handles storing object state, land parcel state, and terrain height-map state
  • Keeps track of where everything is and does collision detection
  • Sends locations of stuff to viewer
  • Transmits image data in a prioritized queue
  • Sends updates to viewers only when needed (when a collision occurs or direction, velocity, etc. change); see the sketch after this list
  • Runs Linden Scripting Language (LSL) scripts
  • Scripting has been recently upgraded to the much faster Mono scripting engine
  • Handles chat and instant messages
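
The "send updates to viewers only when needed" bullet can be sketched as a small filter that suppresses redundant updates. This is a hypothetical illustration, not Linden Lab's actual protocol: the threshold, data layout, and names are assumptions.

    class ObjectUpdateFilter(object):
        """Decide whether a viewer needs a fresh update for an object (illustrative only)."""

        def __init__(self, threshold=0.01):
            self.last_sent = {}         # object_id -> velocity last sent to the viewer
            self.threshold = threshold  # how much velocity change forces an update

        def needs_update(self, object_id, velocity, collided=False):
            last = self.last_sent.get(object_id)
            # First sighting, a collision, or a real change in motion all force an update;
            # otherwise the viewer keeps dead-reckoning from what it already knows.
            changed = (collided or last is None or
                       any(abs(v - lv) > self.threshold for v, lv in zip(velocity, last)))
            if changed:
                self.last_sent[object_id] = velocity
            return changed

    f = ObjectUpdateFilter()
    print(f.needs_update("ball", (1.0, 0.0, 0.0)))  # True: first sighting
    print(f.needs_update("ball", (1.0, 0.0, 0.0)))  # False: nothing changed
    print(f.needs_update("ball", (0.0, 0.0, 0.0)))  # True: it stopped
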
Asset Server
  • One big clustered filesystem, ~100TB
  • Stores asset data such as textures

MySQL database
Second Life started with one database and has subsequently been forced into clustering. They use a ton of MySQL databases running on Debian machines to handle lots of centralized services. Rather than attempt to build the one, impossibly large database – all hail the Central Database – or one, impossibly large central cluster – all hail the Cluster – Linden Lab instead adopted a divide-and-conquer strategy based around data partitioning. The good thing is that UUIDs – 128-bit unique identifiers – are associated with most things in Second Life, so partitioning is generally doable.

Backbone
Linden Lab has converted much of their backend architecture away from custom C++/messaging into web services. Certain services have been moved off of MySQL, or cached (Squid) between the queries and MySQL. Presence, in particular agent presence (are you online, and where are you on the grid?), is a particularly tricky kind of query to partition, so there is now a Python service running on the SL grid called Backbone. It proved to be easier to scale, develop, and maintain than many of their older technologies, and as a result it plays an increasingly important role in the Second Life platform as Linden Lab migrates their legacy code to web services. Two main components of the Backbone are open source:
  • Eventlet is a networking library written in Python. It achieves high scalability by using non-blocking I/O while at the same time retaining high programmer usability by using coroutines to make non-blocking I/O operations appear blocking at the source-code level (see the sketch after this list).
  • Mulib is a REST web service framework built on top of Eventlet.
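
A tiny, generic Eventlet example shows the coroutine style described above: the code reads like ordinary blocking calls, but the green threads cooperate over non-blocking sockets. This is not Backbone code; the URLs are placeholders.

    import eventlet
    # Green version of urllib2: its socket operations yield to other
    # coroutines instead of blocking the whole process.
    from eventlet.green import urllib2

    urls = ["http://example.com/a", "http://example.com/b", "http://example.com/c"]

    def fetch(url):
        # Looks like blocking code, but runs cooperatively with other green threads.
        return url, len(urllib2.urlopen(url).read())

    pool = eventlet.GreenPool(size=100)
    for url, nbytes in pool.imap(fetch, urls):
        print("%s: %d bytes" % (url, nbytes))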

Hardware

  • 2000+ Servers in 2007
  • ~6000 Servers in early 2008
  • Plans to upgrade to ~10000 (?)
  • 4 sims per machine, for both class 4 and class 5
  • Used all-AMD for years, but are moving from the Opteron 270 to the Intel Xeon 5148
  • The upgrade to "class 5" servers doubled the RAM per machine from 2GB to 4GB and moved to a faster SATA disk
  • Class 1 - 4 are on 100Mb with 1Gb uplinks to the core. Class 5 is on pure 1Gb

Do you have more details?


Wednesday
Oct 22, 2008

EVE Online Architecture

EVE Online is "The World's Largest Game Universe", a massively multiplayer online game (MMO) made by CCP. EVE Online's architecture is unusual for an MMOG because it doesn't divide the player load among different servers or shards. Instead, the same cluster handles the entire EVE universe. It is interesting to compare this with the architecture of the Second Life Grid. How do they manage to scale?

Information Sources

Platform

  • Stackless Python used for both server and client game logic. It allows programmers to reap the benefits of thread-based programming without the performance and complexity problems associated with conventional threads (see the sketch after this list).
  • SQL Server
  • Blade servers with SSDs for high IOPS
  • Plans to use InfiniBand interconnects for low-latency networking
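
As a concrete illustration of the Stackless Python point above (a generic sketch, not CCP's code): tasklets are cheap microthreads that communicate over channels, so a single process can host a huge number of them without the cost of OS threads.

    import stackless

    def ship(name, orders):
        # Each ship runs in a tasklet, a microthread far cheaper than an OS thread.
        order = orders.receive()  # blocks only this tasklet
        print("%s executing order: %s" % (name, order))

    def fleet_commander(orders, count):
        for i in range(count):
            orders.send("warp to gate %d" % i)  # rendezvous with one waiting receive()

    orders = stackless.channel()
    for i in range(5):
        stackless.tasklet(ship)("Ship-%d" % i, orders)
    stackless.tasklet(fleet_commander)(orders, 5)

    stackless.run()  # cooperatively schedule all tasklets until they finish
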
What's Inside?

The Stats

  • Founded in 1997
  • ~300K active users
  • Up to 40K concurrent users
  • Battles involving hundreds of ships
  • 250M transactions per day

Architecture

The EVE Cluster is broken into three distinct layers:
  • Proxy Blades - These are the public-facing segment of the EVE Cluster; they are responsible for taking player connections and establishing player communication within the rest of the cluster.
  • SOL Blades - These are the workhorses of Tranquility. The cluster is divided across 90-100 SOL blades, which run two nodes each. A node is the primary CPU-intensive EVE server process running on one core. Some SOL blades are dedicated to single, particularly busy solar systems such as Jita, Motsu, and Saila (see the sketch after this list).
  • Database Cluster - This is the persistence layer of EVE Online. The running nodes interact heavily with the database, and of course pretty much everything to do with the game lives here. Thanks to solid-state drives, the database is able to keep up with the enormous I/O load that Tranquility generates.
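
A toy sketch of how such a layered cluster might route traffic. The mapping of solar systems to SOL nodes and every name below are assumptions made purely for illustration; CCP has not published this level of detail here.

    # Toy model of the three layers described above (all names are hypothetical).
    SOL_NODES = ["sol-blade-%02d-node-%d" % (blade, core)
                 for blade in range(1, 91) for core in (0, 1)]

    # Particularly busy systems get dedicated nodes instead of sharing.
    DEDICATED = {"Jita": "sol-blade-01-node-0", "Motsu": "sol-blade-02-node-0"}

    def node_for_system(system_name):
        """Pick the SOL node responsible for a solar system."""
        if system_name in DEDICATED:
            return DEDICATED[system_name]
        return SOL_NODES[hash(system_name) % len(SOL_NODES)]

    def proxy_forward(player_id, system_name, action):
        """What a proxy blade conceptually does: terminate the player connection
        and hand the action to the right SOL node, which persists state in the
        database cluster."""
        node = node_for_system(system_name)
        return "player %s -> %s: %s" % (player_id, node, action)

    print(proxy_forward("pilot-42", "Jita", "undock"))
    print(proxy_forward("pilot-42", "Amarr", "warp"))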

Lessons Learned

There are many interesting facts about the architecture of the EVE Online MMOG, such as the use of Stackless Python and SSDs.
  • With innovative ideas, MMO games can scale up to hundreds of players in the same battle.
  • SSDs will, to some extent, bridge the huge performance gap between memory and disks.
  • Low-latency InfiniBand network interconnects will enable larger clusters.
Check out the information sources for detailed insights into the development and operation of the EVE Online game.


Friday
Oct 17, 2008

Scaling Spam Eradication Using Purposeful Games: Die Spammer Die!

Update: As expected, I'm undergoing a massive spam attack for speaking truth to dark powers. This is the time to be strong. Together we can make a change. What change, you may ask? I can't say, just change, and lots more change. Let's link arms together and bravely stand against the forces of chaos for a better yesterday and a better tomorrow. CAPTCHA doesn't work. Even Google can't make CAPTCHA work (Spammers Choose GMail). And even if CAPTCHA worked, it wouldn't really work, because CAPTCHA-solving markets (Inside India’s CAPTCHA solving economy) have evolved where for a mere $2 you can buy 1,000 human-broken CAPTCHAs. And we know once the free market tackles a problem, that's it. Game over :-) Making ever more clever CAPTCHA programs won't outwit and outlast the CAPTCHA-solving markets. Until Skynet evolves, the only way to defeat humans is with humans.

Using Games to Get Humans to Do Work (like CAPTCHA) for Free

How do we harness the power of humans to do battle with the CAPTCHA solving networks, without, of course, paying them anything? We make it a game! In particular we make a Game With a Purpose (GWAP). Read all about GWAPs in Designing games with a purpose. A GWAP is a game in which people, as a side effect of playing, perform tasks computers are unable to perform.

Google's Image Labeler

A good example of a GWAP is Google's Image Labeler, a game in which people provide meaningful, accurate labels for images on the Web as a side effect of playing; for example, an image of a man and a dog is labeled "dog," "man," and "pet." Now this sounds like work. And it is. But because it's made into a game, people will do it for free! An example Labeler session works like this: two people are matched at random to label the same set of images. Points are awarded when you and your partner match labels. Top scores are kept so you can earn your label street cred. But can't people cheat? GWAP games include cheating-detection mechanisms, but we won't go into detail here; see Designing games with a purpose for cheater-foiling strategies.

ESP Game, Tag a Tune, and Squigl

More games can be found at the GWAP Home Page. They have the ESP Game, which is like Labeler. Tag a Tune is a game where players hear tunes, describe them, and through the descriptions guess if they are listening to the same tune. In Squigl, partners see an image and a word; using the mouse, each player traces the object described by the word in the image. Winning is when both players trace the same image. So you see the pattern: players are picked from a pool and asked to do some task that's hard for computers to do. The task must be structured so that winning lets the system learn something valid while providing a feeling of game play for the humans. Points are awarded and scores are kept to keep the poor human slaves playing.

Creating a Spam Catcher Game

With the basic ideas in place, let's create a game for identifying and filtering out comment spam. According to Designing games with a purpose, this appears to be an output-agreement type game, which has the following structure:
  • Initial setup. Two strangers are randomly chosen by the game itself from among all potential players;
  • Rules. In each round, both are given the same input and must produce outputs based on the input. Game instructions indicate that players should try to produce the same output as their partners. Players cannot see one another's outputs or communicate with one another; and
  • Winning condition. Both players must produce the same output; they do not have to produce it at the same time, but must produce it at some point while the input is displayed onscreen.

Simple enough. But comments exist as part of blogs, websites, microblogging engines, and other programs, so any game has to interface with live systems. Integrating the game with a comment system might work something like:
  • User comments are sent from an originating system to a decentralized game comment queue.
  • Comments are pulled from the queue as new games start. Posts are stripped of identifying information and presented to the players.
  • Points are allocated if both players agree that a comment is spam or not spam within a very short period of time. With comments, latency is the name of the game, so they need to be processed as fast as possible.
  • Comments and the spam judgments are sent back to the originating system for handling (see the sketch below).

It's not too hard to imagine such a system being used for content other than comments, and for making judgments like age appropriateness and other subtle criteria that could be communicated using site metadata. One UI idea is to make the game like a first-person shooter: spam is blasted into a thousand pieces. Oh, that would be rewarding, but you can also imagine all the usual game-type mechanisms to keep people interested. An accuracy feedback loop would be useful to rate players, so less accurate players could be dropped from the game. Players would be recruited from the general population. Another good source of players is the site owners and site participants whose sites are the source of comments. This would be a sort of Internet Comment Tax for keeping the Internet safe and sane. I, for example, would sign up to process 500 comments a week in order to have HighScalability.com comments processed by the game. Everyone else taking advantage of the system could pledge a number that made sense for their site. This would provide a ready pool of motivated players and docents to keep the game running efficiently. A nice widget system would make it possible to play the game from any site.
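
Here is a minimal sketch of the output-agreement loop described above: two players see the same stripped comment, points are awarded only when their spam/not-spam judgments agree, and the verdict then goes back to the originating site. The queue, scoring, and all names are hypothetical.

    from collections import deque

    comment_queue = deque()   # the (hypothetical) decentralized game comment queue
    scores = {}               # player -> points earned by matching judgments

    def submit_comment(origin_site, comment_id, text):
        """Originating systems push stripped-down comments into the queue."""
        comment_queue.append({"site": origin_site, "id": comment_id, "text": text})

    def play_round(player_a, player_b, judge):
        """One round: both players see the same comment and independently answer
        spam or not-spam. Agreement scores points and produces a verdict to send
        back to the originating system; disagreement requeues the comment."""
        if not comment_queue:
            return None
        comment = comment_queue.popleft()
        verdict_a = judge(player_a, comment["text"])
        verdict_b = judge(player_b, comment["text"])
        if verdict_a == verdict_b:
            for player in (player_a, player_b):
                scores[player] = scores.get(player, 0) + 10
            return {"site": comment["site"], "id": comment["id"], "spam": verdict_a}
        comment_queue.append(comment)   # no agreement: another pair can try
        return None

    # Demo with a stand-in judge; real judgments come from the human players.
    submit_comment("highscalability.com", 1, "BUY CHEAP WATCHES NOW!!!")
    demo_judge = lambda player, text: "BUY" in text or "!!!" in text
    print(play_round("alice", "bob", demo_judge))
    print(scores)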

The Final Move

Spam crushes many sites. Many site owners don't even allow comments anymore because of the time it takes to deal with spam, which is a shame, because without interactivity the internet might as well be a newspaper. We can't let those spammers win! A system like the Spam Catcher Game might be able to provide the human oversight, low latency, and high throughput needed to outcompete the CAPTCHA-solving networks. The game is finally afoot!

Related Articles

  • GWAP Home
  • Designing games with a purpose
  • Inside India’s CAPTCHA solving economy
  • Spammers Choose GMail
  • Google's Image Labeler
  • Google Crashing

