Geographical Calculation in OpenRCT2

I’ve been quietly working on the foundations of Mapsy Analytics, a dedicated platform for intelligently understanding the motion of, and interaction between, anonymized user cells. This can be applied to a number of problems:

  • Recognizing, classifying and predicting time-sensitive, context-aware motion.
  • Crowd annealing for congestion threat minimization.
  • Geographical-context-driven directed marketing.

Mapsy Analytics is driven by machine learning: parallelized neural nets capture the traditionally complex analysis of loosely-defined problems such as crowd motion, where their aptitude for recognizing patterns and trends escapes the bounds of conventional human understanding.

For a neural network to function, it first needs to be trained against a substantial amount of input training data. One of the best-known applications of neural networks is machine vision, which has subsequently seen the publication of many openly accessible test datasets, such as the MNIST handwritten digit database.

Unfortunately, the same cannot be said for the processing of user motion, which is by comparison still in its infancy. So, we’ll have to work from scratch.

Mapsy treats sensitive user data such as precise location very seriously. Besides the obvious security, privacy, and power implications of polling user locations in the background, real traffic is fundamentally inappropriate for generating a test dataset: it’s impossible to classify thousands of users just doing their own thing.

Mapsy is geared towards social geographical domains, such as festivals, conferences and tourist attractions; this implies that our training dataset needs to encompass a variety of specific user characteristics oriented around the following three pillars:

  • Destination arrival and expiry.
  • Static feature attraction such as a theme park ride or toilet.

    • Yes, this is one of the few circumstances where a toilet can be considered a feature.
  • Dynamic feature attraction, like a band performance or an ad-hoc party that breaks out around three in the morning somewhere in the campground.

So, although not ideal, we’ll have to resort to simulating these kinds of interactions. But here, we’re in luck: there’s a fantastic simulation already available.


OpenRCT2 is an open source C++ port of RollerCoaster Tycoon 2, the tremendous theme park management simulator originally released in 2002. The amazing work of the OpenRCT2 Team has opened up some incredible opportunities for the open source community.

OpenRCT2 is also an excellent fit for generating test datasets for Mapsy Analytics, as it provides a high-level emulation of all three pillars of our social geographical analysis.


To emulate the production release of Mapsy Analytics, we need the ability to rig each guest sprite in the park, known as a Peep, with a virtual GPS transmitter. Back in the real world, active Mapsy users rely on the onboard GPS in their mobile phone, so we’ll be emulating Peep positions as real-world locations that can be sent as API requests to Mapsy’s OAuth 2.0 backend via the C++ SDK.

This requires a conversion of a Peep’s { x, y, z } into a corresponding GPS position. We can achieve this using a couple of known properties:

From the RCT Fandom:

For example, Arid Heights (88 x 88 tiles) has an area of 851,840 square feet. Dividing 851,840 by 7,744 (88 x 88) gives 110 square feet per tile. The square root of 110 is approximately 10.5 feet.

In reality, the scalar is 10.4880884817 (the square root of 110). This level of precision matters for latitude and longitude calculations, where a deviation a couple of decimal places in can translate into kilometers of error.
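The quoted arithmetic is easy to verify; this quick sketch simply reproduces the Fandom derivation for the 88 × 88 tile park:

```javascript
// Reproduce the tile-size derivation quoted above.
const areaSquareFeet = 851840; // Arid Heights, per the RCT Fandom wiki
const tiles = 88 * 88;         // 7,744 tiles

const squareFeetPerTile = areaSquareFeet / tiles; // 110
const feetPerTileSide = Math.sqrt(squareFeetPerTile); // ≈ 10.4880884817
```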

From the Codebase:


case PeepPickupType::Place:
    res->Position = _loc;
    if (network_get_pickup_peep(_owner) != peep)
        if (!peep->Place({ _loc.x / 32, _loc.y / 32, _loc.z }, false))
            return MakeResult(GA_ERROR::UNKNOWN, STR_ERR_CANT_PLACE_PERSON_HERE, gGameCommandErrorText);


// Set the coordinate of destination to be exactly
// in the middle of a tile.
CoordsXYZ destination = { location.x * 32 + 16, location.y * 32 + 16, tileElement->base_height * 8 + 16 };
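Pulling the two snippets together, the conversions between tile indices and world coordinates can be sketched as follows (`TILE_SIZE` and the helper names are my own, not OpenRCT2 identifiers):

```javascript
// A tile spans 32 world units per axis in the snippets above,
// and a tile's centre sits at an offset of half a tile.
const TILE_SIZE = 32;

// World coordinates -> tile index, as in `_loc.x / 32`.
const worldToTile = ({ x, y }) => ({
  x: Math.floor(x / TILE_SIZE),
  y: Math.floor(y / TILE_SIZE),
});

// Tile index -> world coordinates of the tile's centre,
// as in `location.x * 32 + 16`.
const tileToWorldCentre = ({ x, y }) => ({
  x: x * TILE_SIZE + TILE_SIZE / 2,
  y: y * TILE_SIZE + TILE_SIZE / 2,
});
```

Round-tripping a tile through both helpers lands back on the same tile index, which mirrors the pickup-and-place behaviour.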

When picking up and dropping a Peep, OpenRCT2 aims to place them in the centre of a tile. The calculations above show us that a tile spans 32 coordinate units on each axis, with its centre at an offset of 16.

This shows us that one increment of x or y is equivalent to:

const feetPerUnit = 10.4880884817 / 32;
const metersPerUnit = feetPerUnit * 0.3048; // 0.09989904278, which is roughly `10cm`.

So, if we assume a latitude and longitude for the origin of the park’s coordinate system, the geolocation of a Peep can be computed as:

const getPeepLocation = (peep, parkOrigin) => {
  const metersPerUnit = 0.09989904278;
  const { x, y } = peep;
  const { latitude, longitude } = parkOrigin;
  const radiusOfEarthMeters = 6378137;

  const offsetNorth = y * metersPerUnit;
  const offsetEast = x * metersPerUnit;

  // Angular offsets in radians, via a flat-earth approximation
  // that is more than accurate enough at park scale.
  const dLat = offsetNorth / radiusOfEarthMeters;
  const dLon = offsetEast / (radiusOfEarthMeters * Math.cos(Math.PI * latitude / 180));

  // Convert the radian offsets back to degrees.
  return {
    latitude: latitude + dLat * 180 / Math.PI,
    longitude: longitude + dLon * 180 / Math.PI,
  };
};

Algorithm adapted from Ed Williams’ Aviation Formulary.
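The offset-to-degrees step can be sanity-checked in isolation. The following is a minimal, self-contained sketch of the same flat-earth approximation, with a hypothetical origin; `offsetLocation` is my own name, not part of any SDK:

```javascript
// Convert metre offsets (east, north) from an origin into a new
// latitude/longitude, using the flat-earth approximation from the
// Aviation Formulary. Accurate to well under a metre at park scale.
const EARTH_RADIUS_METERS = 6378137; // WGS84 equatorial radius

const offsetLocation = ({ latitude, longitude }, eastMeters, northMeters) => {
  const dLat = northMeters / EARTH_RADIUS_METERS; // radians
  const dLon = eastMeters / (EARTH_RADIUS_METERS * Math.cos(Math.PI * latitude / 180));
  return {
    latitude: latitude + dLat * 180 / Math.PI,
    longitude: longitude + dLon * 180 / Math.PI,
  };
};

// Hypothetical origin: moving 100m due north should nudge latitude by
// roughly 0.0009 degrees and leave longitude untouched.
const moved = offsetLocation({ latitude: 51.5, longitude: -0.12 }, 0, 100);
```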

This allows us to take any Peep assigned to a virtual GPS device and transmit their location to the Mapsy Analytics API for online learning and subsequent classification.