
StarSample V2.0 Lite

This is a model for the restoration of My Little Pony: Friendship is Magic; however, it also works decently well on art in a similar style.

V2.0 greatly improves upon V1.0's dataset in every way, taking the models from realistically only being viable at 1x to being far more competent at 2x, especially the models in this release trained with heavier architectures.

The improvements come from a significantly better understanding of compression, along with partly architectural and partly dataset-driven improvements in the handling of detail and the overall understanding of content, leading to less artifacting and "AI smudging". The dataset draws from a larger variety of sources, despite being smaller than V1.0's (which, when tiled, would be 71,876 pairs), because it is filtered for IQA scores and detail density. It also contains many thousands of image pairs created manually to cover areas where there wasn't sufficient information.
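The IQA filtering step isn't detailed here; purely as a hypothetical illustration of the idea, here is a tile-filtering sketch using the pyiqa library. The metric choice ("hyperiqa") and the threshold are assumptions for illustration, not the actual pipeline:

```python
# Hypothetical sketch of IQA-score dataset filtering using the pyiqa library
# (https://github.com/chaofengc/IQA-PyTorch). The metric and threshold are
# illustrative; the actual StarSample pipeline is not specified in this card.
import pyiqa
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
metric = pyiqa.create_metric("hyperiqa", device=device)  # no-reference IQA, higher = better

def keep_tile(image_path: str, threshold: float = 0.6) -> bool:
    """Keep an HR tile only if its no-reference quality score clears the bar."""
    with torch.no_grad():
        score = metric(image_path).item()  # pyiqa accepts image file paths directly
    return score >= threshold
```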

This release also includes "NS", or "No Scale", models, which are a better representation of my initial goal with StarSample; 1x StarSample V2.0 NS should provide great 1x restoration results with little apparent artifacting, even where the heavier 2x models can fail because they also have to increase resolution.

  • 2x StarSample V2.0 HQ — (HAT-L)
  • 2x StarSample V2.0 — (ESRGAN)
  • 2x StarSample V2.0 Lite — (SPAN-S) ← this model
  • 1x StarSample V2.0 NS — (ESRGAN)
  • 1x StarSample V2.0 Lite NS — (SPAN-S)
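
Assuming the released weights are standard PyTorch checkpoints, here is a minimal sketch of loading and running one of these models with the spandrel model-loading library (which supports SPAN, ESRGAN, and HAT). The file name and input tensor are illustrative:

```python
# Minimal upscaling sketch using spandrel (https://github.com/chaiNNer-org/spandrel).
# The file name and input below are illustrative, not part of the release.
import torch
from spandrel import ImageModelDescriptor, ModelLoader

model = ModelLoader().load_from_file("2x_StarSample_V2.0_Lite.pth")
assert isinstance(model, ImageModelDescriptor)  # single-image super-resolution model
model.cuda().eval()

# Input: float tensor of shape (1, 3, H, W) with values in [0, 1].
image = torch.rand(1, 3, 256, 256, device="cuda")
with torch.no_grad():
    upscaled = model(image)  # (1, 3, 512, 512) for a 2x model
```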

GitHub Release

Architecture: SPAN
Scale: 2x
Size: 48nf
Color Mode:
License: CC-BY-NC-SA-4.0 (permits private use, distribution, and modifications; requires credit, the same license, and stated changes; provides no liability and no warranty)
Date: 2026-02-12
Dataset:
  • HR: 4K GT uncompressed MLP: FiM episode frames + relevant uncompressed HR pairs for the LR datasets
  • LR: 1080p MLP: FiM episode frames sourced from YouTube at 3 different bitrates + custom MLP: FiM focal blur dataset + custom MLP: FiM GIF compression dataset at 3 different compression levels + custom MLP: FiM difficult details and other edge cases dataset + custom artificially-degraded MLP: FiM background dataset
Dataset size: 53,560 pairs
Training iterations: 500,000
Training epochs: 149
Training batch size: 16
Training HR size: 192 px
Training OTF (on-the-fly degradations): No
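
As a quick consistency check on the figures above (variable names are illustrative): 53,560 pairs at batch size 16 is about 3,348 iterations per epoch, so 500,000 iterations works out to roughly 149 epochs.

```python
# Consistency check for the training figures above.
dataset_pairs = 53_560
batch_size = 16
iterations = 500_000

iters_per_epoch = dataset_pairs / batch_size  # 3,347.5 batches per pass over the data
epochs = iterations / iters_per_epoch         # ~149.4
print(f"~{epochs:.0f} epochs")                # -> ~149 epochs
```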
