Screen Sizes. Again.

A couple of weeks is all it took for me to revisit this.

It seems that Apple read my previous post about using iPad-sized assets across all devices and said to themselves “Best put a stop to that nonsense.”

In the announcement about iOS 12.1, they said that they were going to be a lot more strict about memory management, with a specific focus on reducing the memory footprint of Metal apps.

SpriteKit ultimately uses Metal under the hood and, while I expect 2D games to be less resource hungry than bigger 3D experiences, I wouldn’t be surprised if my shortcut of using 1x assets sized for the largest iPad across all devices falls foul of the new rules.

Plan B

I resisted it for as long as I could, but it’s time to use point sizes and asset catalogs.

“But you said not to use points! Most iPhones are 2x devices but so are all the iPads, and yet the iPads need assets double the size of the iPhones!” you shout at me, deftly summarising my previous 2,000 words about this topic in a single sentence.

This is correct. I did say these things. It was a pragmatic approach to game development on Apple hardware—make asset generation as easy as possible but keep things looking great across all devices.

I did suspect I might have to do something about it once I had more experience. Now it seems that Apple is going to force my hand.

Letting points and asset catalogs do the thing they were designed to do is going to be the way forward.

The Downside to Points

As I mentioned before, this is not without its problems. For things to look crispy on iPads, assets need to effectively be 4x on a system expecting them to be 2x.

Halving Measurements

All positions, sizes, lengths, widths, move speeds, and anything else that relies on spatial measurements will need to be halved in order to make them suitable for the phone.

In order for this to work effectively, all game data will need to be removed from code.

This means no magic numbers. Anywhere. At all. Not even during development, because I guarantee that if I let the one I just threw in there to get things working slide, it will appear in production.
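As a sketch of what this looks like in practice (the field names here are made up for illustration), pulling every spatial value into a Codable structure makes the halving step purely mechanical:

```swift
import Foundation

// Hypothetical spatial values; the real fields depend on the game.
struct Measurements: Codable {
    var playerWidth: Double
    var moveSpeed: Double
}

// Tool-side step: the iPad file is the master, and the iPhone file
// is generated by halving every spatial value.
func halved(_ m: Measurements) -> Measurements {
    return Measurements(playerWidth: m.playerWidth / 2,
                        moveSpeed: m.moveSpeed / 2)
}

let ipad = Measurements(playerWidth: 128, moveSpeed: 600)
let iphone = halved(ipad)
// iphone.playerWidth == 64, iphone.moveSpeed == 300
```

Because no spatial value lives in code, the game itself never needs to know which variant it loaded.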

Scaling Down

The other major downside is that all these measurements ideally need to be even numbers so they can be halved cleanly.

The reason I’m doing it this way—using iPad measurements and halving them to get the iPhone measurements as opposed to using the iPhone measurements and doubling them—is because 2D raster assets should be created at the largest required size and scaled down.

If I was creating a game that used vector assets exclusively, I could (and probably should) create assets at the smallest 1x size and scale up.

Accessing the Data

Then it will be a case of creating two data files that contain all this spatial data (one for the iPhone and one for the Mac and iPad) and copying them into the asset catalog.

Using asset catalogs to organise my data files lets iOS do all the heavy lifting when it comes to device detection.

To access it in the game, I can load it like this:

guard let assetData = NSDataAsset(name: "GameData")?.data else {
    fatalError("Game data missing! This data is probably important!")
}

// Do things with the data
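From there, decoding into a Codable structure is straightforward. A minimal sketch, assuming the data files are JSON and using a made-up SpatialData type:

```swift
import Foundation

// Hypothetical structure; the asset catalog has already handed us
// the file that matches this device's idiom, so the same decoding
// code works everywhere.
struct SpatialData: Decodable {
    let tileSize: Double
    let moveSpeed: Double
}

// Stand-in for the bytes returned by the asset catalog.
let json = "{\"tileSize\": 64, \"moveSpeed\": 300}".data(using: .utf8)!
let spatial = try! JSONDecoder().decode(SpatialData.self, from: json)
// spatial.tileSize == 64.0 (the iPhone file would contain 32)
```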

Creating the Scene Size

When it comes to actually setting up SpriteKit, it all depends on whether or not I’m using a camera. If I am, then I can just use the device screen size and the asset catalogs and scaling will take care of the rest:

let scene = SKScene(size: self.view.bounds.size)

Otherwise, if I want to use a fixed size and accept some clipping, then I’ll need to do some device detection on iOS:

let size: CGSize
if UIDevice.current.userInterfaceIdiom == .pad {
    size = CGSize(width: 2220, height: 1024)
} else {
    size = CGSize(width: 1110, height: 512)
}
let scene = SKScene(size: size)

The image asset documents will be 4440×2048 pixels. However, only the @2x iPads will use assets from this document at full size and, because those iPads are scaled @2x, the size passed to the SKScene initialiser needs to be given in points (2220×1024).

The phones will then use half this point size, with the actual pixel sizes being 2220×1024 on the @2x phones and 3330×1536 on the @3x phones.
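The arithmetic, spelled out:

```swift
// Master image document, in pixels.
let masterWidth = 4440.0, masterHeight = 2048.0

// @2x iPad: uses the master pixels at full size, so its point
// size is half the pixel size.
let ipadPointWidth = masterWidth / 2     // 2220
let ipadPointHeight = masterHeight / 2   // 1024

// iPhone: half the iPad point size.
let iphonePointWidth = ipadPointWidth / 2    // 1110
let iphonePointHeight = ipadPointHeight / 2  // 512

// Actual pixels rendered on the phones.
let iphone2xWidth = iphonePointWidth * 2  // 2220
let iphone3xWidth = iphonePointWidth * 3  // 3330
```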

Automating This Process

There are at least five files that need to be created for each asset: iPhone 2x and 3x, iPad 1x and 2x, and Mac 2x (Mac 1x would be just for completeness, as the memory limitations there are less strict).

On top of this, I’ll need to keep track of which data files need to have their spatial data halved and generate three versions of these (iPhone, iPad, and Mac).

This is a lot of tedious file management, so I’m currently in the process of developing an asset management app (because of course I am) that will automate much of this.

Here’s what it will do:

  • Allow me to add multiple different folders of files to a single asset collection
  • Filter out device-specific assets for easier browsing and organisation (i.e. changing the name of one asset will change it across all device-specific files)
  • Copy all assets in any of the folders directly into the asset catalog, creating the necessary asset catalog JSON so that Xcode knows what’s happening
  • Support audio, game data, and colours as well as images
  • Create and preview animations and generate animation JSON
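For the image side, the generated JSON would follow the Contents.json format Xcode writes for an image set — something along these lines (the filenames are placeholders):

```json
{
  "images" : [
    { "idiom" : "iphone", "filename" : "Hero@2x~iphone.png", "scale" : "2x" },
    { "idiom" : "iphone", "filename" : "Hero@3x~iphone.png", "scale" : "3x" },
    { "idiom" : "ipad", "filename" : "Hero~ipad.png", "scale" : "1x" },
    { "idiom" : "ipad", "filename" : "Hero@2x~ipad.png", "scale" : "2x" },
    { "idiom" : "mac", "filename" : "Hero@2x~mac.png", "scale" : "2x" }
  ],
  "info" : { "version" : 1, "author" : "xcode" }
}
```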

The Right Way is Often the Hard Way

As expected, all of this is a lot more work than just using 1x assets targeted for the largest screens across all devices.

However, it does enforce a certain amount of development discipline.

Firstly, memory usage will be a lot more appropriate for each device and will make me a more responsible iOS developer.

Secondly, game data should never be part of the engine but it’s all too easy to just throw in a property with a fixed number, especially when getting things up and running quickly.

Now that I know that there will be game-breaking consequences for this, I’ll need to do the extra work up front.

It took me many months to get here, but I’m finally doing the right thing.