
Why Character Artists Are Embracing AI (And Why You Should Too)

The Rapid Growth of the IP Character Market

The IP character market is growing rapidly, and the character entertainment market as a whole is massive. According to the article "Overnight lines, mall fights and instant sellouts: Labubu toy mania comes to America," Pop Mart reported record revenue of $638.5 million for the first half of 2024, a 62% increase over the same period a year earlier, with sales in its burgeoning North America segment totaling $24.9 million.

You’ve likely seen some of Pop Mart’s famous IP characters, such as Labubu, Molly, and Bobo & Coco. These IP characters are driving the creation of numerous derivatives, including toys, NFTs, avatars, digital twins, virtual pets, and 3D-printed artifacts.

An Empirical Study of Neural Network Training Dynamics

In the paper Distribution Density, Tails, and Outliers in Machine Learning: Metrics and Applications, the authors proposed several metrics to quantify examples by how well-represented they are in the underlying distribution.

After reading the paper, I started wondering: When humans learn, we typically begin with easy materials and questions, gradually progressing to more difficult topics. Does neural network learning follow a similar pattern? Do networks learn easy or well-represented examples first and move on to more complex ones later?

To address this question, I trained a simple fully connected neural network with a single hidden layer of 8 units on the MNIST dataset. To obtain more reliable, less noisy observations, I added an evaluation step after each training iteration: the network processed the entire training set, performed backpropagation, and recorded each example's prediction and the gradients of the second fully connected layer (an 8 × 10 matrix), without updating the network parameters.
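The evaluation step above can be sketched roughly as follows. This is a minimal NumPy illustration, not the author's actual code: the network shapes (784 → 8 → 10) follow the description, but the activation function, initialization, and the random stand-in data are assumptions. The key point it demonstrates is computing per-example gradients of the second layer's weight matrix (one 8 × 10 matrix per example) during a forward/backward pass that performs no parameter update.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for MNIST: random "images" and labels.
X = rng.standard_normal((32, 784)).astype(np.float32)
y = rng.integers(0, 10, size=32)

# One hidden layer with 8 units (784 -> 8 -> 10); ReLU is an assumption.
W1 = (0.01 * rng.standard_normal((784, 8))).astype(np.float32)
b1 = np.zeros(8, dtype=np.float32)
W2 = (0.01 * rng.standard_normal((8, 10))).astype(np.float32)
b2 = np.zeros(10, dtype=np.float32)

def eval_step(X, y):
    """Forward + backward over the whole set, recording predictions and
    per-example gradients of W2 (an 8x10 matrix each). No weight update."""
    h = np.maximum(X @ W1 + b1, 0.0)                 # hidden activations (N, 8)
    logits = h @ W2 + b2                             # (N, 10)
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)        # softmax
    preds = probs.argmax(axis=1)

    # Cross-entropy backprop: dL/dlogits = probs - one_hot(y);
    # per-example dL/dW2 is the outer product of h and dL/dlogits.
    dlogits = probs.copy()
    dlogits[np.arange(len(y)), y] -= 1.0             # (N, 10)
    grads_W2 = h[:, :, None] * dlogits[:, None, :]   # (N, 8, 10)
    return preds, grads_W2

preds, grads_W2 = eval_step(X, y)
print(preds.shape, grads_W2.shape)  # (32,) (32, 8, 10)
```

Averaging `grads_W2` over the first axis recovers the ordinary batch gradient, which is why recording the per-example matrices adds no extra forward passes beyond the single evaluation sweep.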