Question Concerning Cinema

jnjnilson6

Distinguished
This is admittedly a vague question, asked in aesthetic appreciation of the glorious enchantment spread across the screen.

I would be glad if anyone could provide information about the hardware, most importantly the CPUs and GPUs, pressed into service to deliver the elaborate and entertaining animations we see more and more these days from the big Hollywood companies like Universal, 20th/21st Century Fox, Columbia Pictures, etc.

Specifically: do you have any information on the specifications of the supercomputers employed for these animated spectacles, from the beginning of the 21st century up until today? Specifications from anywhere in that period would be welcome, along with the timeframe in which the machines were in use. And, if possible, naming the particular films they worked on would add welcome clarity. It would be very much appreciated.

Do write back, and

Thank you!
 

Eximo

Titan
Ambassador
I know about some of the older ones: they used Silicon Graphics workstations for films like Toy Story. And the GUI shown in Jurassic Park is a real file browser (FSN) running on IRIX, the actual Unix OS those Silicon Graphics workstations used.

These days, just pick a data-center GPU, multiply it by several hundred or several thousand, and you have the basic shape of the render farms used in recent films and digital animation.
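To put "multiply it by several thousand" in perspective, here is a rough back-of-envelope sketch of render-farm math. Every number in it is an illustrative assumption, not a figure published by any studio:

```python
# Back-of-envelope render-farm sizing. All numbers below are
# illustrative assumptions, not published studio figures.

frames = 90 * 60 * 24       # assumed ~90-minute film at 24 fps
hours_per_frame = 10        # assumed average render-hours per final frame
node_count = 2000           # assumed number of render nodes in the farm

total_render_hours = frames * hours_per_frame
wall_clock_days = total_render_hours / node_count / 24

print(f"{frames} frames, {total_render_hours:,} render-hours total")
print(f"~{wall_clock_days:.0f} days of wall-clock time on {node_count} nodes")
```

Even with these made-up numbers, the point stands: a feature film is on the order of a hundred thousand frames, so a single workstation would take decades, while a few thousand nodes bring it down to weeks.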

Did catch an article about the Super Mario movie. Illumination apparently uses IBM servers of this type:

 
Hey there,

What a wonderful way to ask this question :)

You might try this: https://boords.com/blog/filmmaking-101-what-is-cgi-in-movies-and-animation
 

Eximo

Titan
Ambassador
On the software side, in recent years studios have started using game engines, especially for TV production. The Mandalorian, for example, uses Unreal Engine for its CG backgrounds. Pixar has also been using NVIDIA's GPU technology for a while now.

At that point, the hardware question is just a matter of throwing as many high-end workstation cards at it as you can afford.

It is actually quite amazing what you can manage with even a single RTX 4090 when using Unreal for cinematics, not to mention Maya, Blender, After Effects, etc. for more traditional CGI. Some of the AI models (auto-rotoscoping, camera tracking) and the simulations for smoke, water, fire, and cloth can all run quite comfortably on a powerful workstation these days.