How To: Building Your Own Render Farm


Draven35

[citation][nom]MonsterCookie[/nom]IN FACT, most real design/3D applications run under unixoid operating systems.[/citation]

Yes, but most (texture) painting software does not, several 3D tracking apps do not, 3D Studio Max does not, and After Effects does not. They run under Windows and OSX. Most artists actually working in animation are using Windows and/or OSX, even though some studios have implemented Linux... (some studios still use IRIX on some machines!)
 
Guest
Um, has anyone mentioned cloud computing yet? I have two older-gen Macs at home: a G5 Quad and an Intel MacBook Pro. I don't use either for rendering, just nodes on Amazon EC2. I pay by the hour, there's little setup time and no hardware purchase, I can run as many nodes as Amazon lets me, and I link them using their temporary DNS addresses. It seems to me that building render farms like this isn't cost- or labor-effective for smaller players. With Amazon EC2, the machines are virtualized, so Amazon maintains and upgrades the hardware.
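For anyone curious, the provisioning step can be scripted against the EC2 API. Here's a rough sketch using Python and the boto3 library (the AMI ID and instance type are placeholders, not recommendations):

[code]
# Sketch: launch four render nodes and print the temporary public DNS
# names that a render controller would use to reach them.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

resp = ec2.run_instances(
    ImageId="ami-00000000",    # placeholder: an image with your renderer installed
    InstanceType="c5.xlarge",  # placeholder: pick a CPU-heavy type
    MinCount=4,
    MaxCount=4,
)
ids = [inst["InstanceId"] for inst in resp["Instances"]]

# Block until the nodes are running, then look up their DNS names.
ec2.get_waiter("instance_running").wait(InstanceIds=ids)
for reservation in ec2.describe_instances(InstanceIds=ids)["Reservations"]:
    for inst in reservation["Instances"]:
        print(inst["InstanceId"], inst["PublicDnsName"])
[/code]

When the job is done, terminate the instances and the hourly billing stops.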
 

Draven35

Cloud computing may have some future in this area, but right now I don't know if any of the cloud computing vendors are set up to handle it. It is an interesting prospect and something to keep an eye on. However, given the nature of cloud computing, some of an animator's clients may not be entirely satisfied with the security of their intellectual property... I've had a few that didn't care for the use of online render farms.
 
Guest
It would be nice to have one of the nodes connected to some graphics cards, add it to a desktop, and play Crysis or other games on it!
 
Guest
The next installment of Quake, Unreal, or Crysis will require at least an 8-core unit with quad-SLI Nvidia cards. Just because.

You can expect:
Titanium Walls - non-destructible environments
Long load times
DRM
And 8 dual-layer discs full of streaming content.

The game will sell for $79.95, or $115.95 for the collector's tin.

In 2000, a PS2 could render a game with great graphics and a destructible environment. What kind of processor was that?

Lies,

Sale Sale Sale :)
 

MonsterCookie

[citation][nom]Draven35[/nom]Yes, but most (texture) painting software does not, several 3D tracking apps do not, 3D Studio Max does not, and After Effects does not. They run under Windows and OSX. Most artists actually working in animation are using Windows and/or OSX, even though some studios have implemented Linux... (some studios still use IRIX on some machines!)[/citation]

Well, those software companies which do not have Linux support are not serious, because most old-school applications were made for Unix, and YES, mostly for SGI IRIX. (I am really sorry for the SGI company and their IRIX operating system, btw. Since I started working on an SGI Octane, I became a big fan and collector of their machines, T-shirts, and posters.)

Another thing: since we are talking about a render farm, the applications mentioned above don't really fall into this category.
Also, once your POV-Ray (or whatever) file is ready, you do not need Windows to render it anymore.
For office apps M$ might be good for lazy people who do not use LaTeX, but for real workstation applications it sucks big time.
My extreme philosophy: if something does not run on Linux/Unix, it was never meant to be used by humans.
It is only a question of time until people realize this.
I am not an M$ hater, but I just cannot stand those companies which try to force M$ products on you against your will.
 

Draven35

Larger studios run Linux because they can get the experienced support people to do it. Smaller studios and individual artists tend to run Windows, or less often OSX. I could sit here and rattle off which studios are primarily using Windows apps, which are using OSX, and which are using Linux, and you'd see that the list gets progressively smaller with each operating system in the list.

Once your file is rendered, if 3D, you usually tweak it in a compositing application... which requires having one.

Most of the 3D applications (and just as importantly, the plugins for 3D applications) are for Windows and Mac. We're talking commercial 3D applications (e.g., Maya, 3D Studio Max, LightWave, Softimage, and Cinema 4D, to name a few), not POV-Ray and Blender. Not that you can't do work using those, but using the 'standard applications' is very important if you want to be productive, or get a job doing 3D.
 

ossie

[citation][nom]Draven35[/nom]... Most of the 3D applications (and just as importantly, the plugins for 3D applications) are for Windows and Mac. We're talking commercial 3D applications (e.g., Maya, 3D Studio Max, LightWave, Softimage, and Cinema 4D, to name a few), not POV-Ray and Blender... [/citation]
Maya has had a native Linux port for years.
For 3ds Max, you could use mental ray on a Linux render farm.
LightWave had Linux support for ScreamerNet until a while ago.
Softimage also has Linux support.
C4D is available for Linux, but only for select customers...

Please, get your facts straight.
We already know TH is pu$hing m$ products, so it's not necessary to underline it any more... you just lose credibility with such unsubstantiated affirmations.
 

Draven35

I'm not pushing Microsoft products. I mentioned OSX a bunch of times. And yes, you could use Linux-based rendering if your renderer (and render controller) support it... hopefully your compositing application does as well. And hopefully any plugins you need, too... and the $140, plus or minus, on each render node is a pittance compared to the $999-per-system price for mental ray render nodes.
 

ossie

The problem isn't just the "$140/node", as you try to simplify it.
m$ $ucks big time in scalability, memory/task management, networking, and, last but not least, $ecurity (and the list could go on...).
If you go the m$ route you're locked into one vendor, and, as the latest developments show, you'll get all kinds of "improvement$" (DRM-O$, anyone?) which directly affect system performance (of course, you could run the $erver version, which isn't so "enhanced"):
[citation][nom]Draven35[/nom]Since 64-bit Windows XP is still available through OEM channels, there is no need for you to even consider putting Vista on your nodes, wasting memory and processing power.[/citation]
Why do you believe that [citation][nom]Draven35[/nom]Larger studios run Linux[/citation]?
Do you really believe the big ones would scrimp on [citation][nom]Draven35[/nom]the $140 plus or minus on each render node[/citation]?
As for [citation][nom]Draven35[/nom]I'm not pushing Microsoft products[/citation]: I've already praised you for not doing that in your article; I was referring to the usual TH blurb$.
 

Draven35

Ok, ILM's render farm is primarily Linux. So is Sony Imageworks'. And, from all reports, Rhythm & Hues'. Pixar's is a mix of Unix and OSX. Digital Domain's render farm is a lot of dual-boot systems. A lot of smaller studios are Windows or OSX simply because they want to stick to one platform between their workstations and render nodes, and don't want to have to keep a Linux support person for their nodes. Zoic, Eden FX, and many others are primarily Windows.
 

eodeo

FYI, Nvidia didn't abandon Gelato because it thought GPU rendering was done and over. It did it because all five users of Gelato wouldn't get upset at the news.

That, and Nvidia redirected its focus to GPGPU rendering in mental ray as soon as it bought Mental Images in late 2007 - at which point it abandoned the Gelato project as well.

So on the small scale it looks like a setback, while in fact it's a huge step forward. All high-end 3D software supports mental ray. Most comes bundled with it. Imagine rendering 100x faster for 10x less money - that is, if Nvidia allows non-Quadro cards to do this. And even if they don't, someone will crack it, since there is no physical difference between a Quadro and its same-chip GeForce variant.

To the rendering of the future - TWIMTBR - The Way It's Meant To Be Rendered.
 

Draven35

[citation][nom]eodeo[/nom]All high-end 3D software supports mental ray. Most comes bundled with it.[/citation]

LightWave doesn't. And for the software that does, some people still prefer renderers other than mental ray - V-Ray, Brazil, etc., and lest we forget, RenderMan. Studios can and do use these renderers interchangeably. Many of the 'high end' studios have their own renderers (ILM, R&H, et al.), and Pixar is using the 'next version' of RenderMan.

100x faster is in doubt. Look at CPU- vs. GPU-based encoding for video: you don't see a 100x improvement. GPU-based rendering does not render using the graphics card's real-time engine; it renders using its own engine and uses the GPU to process the instructions it is particularly suited for. Within a few years, rendering for today's television visual effects may be done on the GPU, but film will likely remain GPU-assisted for a bit longer. Remember that the lovely shots you see rendered in real time in games are highly optimized - years' worth of man-hours spent getting them to work within the engine.

A 'good' television visual-effects shot these days will use area lights, global illumination with ambient occlusion, traced shadows, and anti-aliasing set much higher than 16x AA yields, at very high polygon counts (usually millions for 'hero' objects; the Galactica was around two million polygons). Objects also tend to have shaders and textures in the color, diffuse, luminosity, specular, glossiness, reflection (if reflective), and bump channels. The texture maps themselves can be large and multilayered - 512 MB in and of themselves for a single model. And this is television, not feature films, and we're essentially discussing a hard-surface model, because I haven't even mentioned having to do things like sub-surface scattering to simulate skin, or the number of polygons a good hair/fur simulation adds to the scene. Add on top of that the possibility of volumetrics and/or volumetric lighting in the scene, and the necessity for the objects to reflect that.
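That 512 MB figure is easy to reach; here's a sketch of the arithmetic (the 4k float RGBA format is my assumption for illustration, not a statement about any particular production):

[code]
# How texture memory adds up for a single model.
width = height = 4096      # one 4k map
channels = 4               # RGBA
bytes_per_channel = 4      # 32-bit float

mb_per_map = width * height * channels * bytes_per_channel / 2**20
print(f"{mb_per_map:.0f} MB per map")  # 256 MB: two such layers hit 512 MB
[/code]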

As for the amount of memory, yes, more is better. But if you're designing scenes on a machine with only 4 GB, and have done full-up test renders on that machine, then you're likely to only need 4 GB on your nodes as well. The idea, though, was to present a budgetary and system baseline, and the article says as much, under the assumption that you know roughly how much memory your scenes use while rendering.

But it boils down to making your own decisions.

Do you want a few render nodes to increase your productivity? (i.e. ability to get work done...)

Will it be worth the $140 per node to use the operating system you already know and use on your workstation, or is it cost-effective for you to learn how to administer a set of Linux nodes, including a render controller that runs under Linux? (Note: at the low end of animator pay, a day and a half spent learning Linux is the cost of putting Windows on two nodes... see the sketch below.)
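Putting rough numbers on that (the day rate here is an assumption for illustration, not a statistic):

[code]
# Rough version of the break-even claim above.
windows_license = 140      # dollars per node, from the article
day_rate = 185             # dollars/day, assumed low-end animator pay
learning_days = 1.5        # time spent learning Linux administration

learning_cost = learning_days * day_rate                           # $277.50
print(f"${learning_cost:.2f} = about "
      f"{learning_cost / windows_license:.1f} Windows licenses")   # ~2 nodes
[/code]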

Do all the renderers, and the plugins to those renderers, that you use run under Linux? Will it cost you extra to get the versions that run under Linux? Will it cost you more per system than it would have to just put Windows on the machine?

I could very well have spent an extra page discussing the pros and cons of putting Linux on your render nodes. At this point, I basically have.
 

Ryun

Only about halfway through, and I'm really liking it so far. However, I need to point something out:

[citation][nom]Article, Page 4[/nom]However, keep in mind that with a low thermal design power (TDP) processor, these systems should only consume about 140 W of power apiece at 100% utilization...[/citation]

While that may be the case, the TDP is not a measure of power consumption. This is a common misconception: TDP is the realistic maximum amount of heat the chip gives off, which its cooling must handle, not its power usage. This is why, in the article here, the whole system at maximum load uses less power than the 65 W TDP of the E7200 alone: http://www.tomshardware.com/reviews/intel-e7200-g31,2039-13.html
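For budgeting, measured wall power is the number that matters. Taking the article's 140 W per-system figure at face value (the around-the-clock duty cycle is my assumption):

[code]
# Annual electricity cost of one 140 W render node.
node_watts = 140
rate = 0.1159              # $/kWh, roughly the US average at the time
hours = 24 * 365           # assume it renders around the clock

kwh = node_watts / 1000 * hours                                   # ~1226 kWh/yr
print(f"~{kwh:.0f} kWh/yr, about ${kwh * rate:.0f}/yr per node")  # ~$142
[/code]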
 

MonsterCookie

[citation][nom]shapr[/nom]I recently purchased a used IBM BladeCenter and seven QS20 dual Cell blades on ebay for a total cost of $2500. That gets me about 2.5 Teraflops of rendering power, and adds about $50 to the monthly power bill in my apartment. There are cheaper options if you're willing to go a bit further afield.[/citation]

Wow, don't you mean $25,000??
Honestly, for only $2,500 it was practically a present.
Now I envy you.

Good luck with it, man.
Do you have an InfiniBand interconnect between the nodes, btw?
 

ossie

[citation][nom]Draven35[/nom]Alot of smaller studios are Windows or OSX simply because they want to stick to on e platform between their workstations and render nodes, and don't want to have to have a linux support person for their nodes.[/citation]
Again, you have a somewhat limited and distorted view of Linux.
Linux is about freedom. It's a whole universe covering the entire visible spectrum - and you could add the invisible, IR and UV, as well - as opposed to windblow$' blue(sod). It offers you the freedom to choose what fits you best, and to dig as deep as you like into its innards. If you prefer the button-pushing style, you'll find enough dumbed-down distros to fit the bill. You don't have to be a Linux guru to use them, but some background knowledge is, as always, recommended.
It is a widespread, and heavily supported (read: $), misconception that windblow$ is easy to use and manage. The pu$hers will try to convince you that it's as simple as "point-n-click" to do everything. Apparently it works, if everything goes as devised, but it fails miserably if something in the surroundings wasn't accounted for - and the usual reaction is to "crap out" - m$ (r).
Also, a GUI offers you a very limited number of options (just the ones built in at conception), and if your needs exceed them, hard luck: a cryptic command line (a lot worse than on nix-es) or the dreaded registry hacks.
You are basically at the mercy of m$ if something goes wrong (perhaps with the exception of very large customers, which can leverage some pressure - but don't count on it) - and, as history proves, it repeatedly has gone, goes, and will go wrong...

m$ and its partner mafia are a lot like a drug operation. m$ products are like narcotics: the all-powerful drug lords (m$) flood the market with "new", "enhanced" products that just deepen the dependency. The partner mafia offers "protection" (AV & co.) from the bad things out there (dangerous mainly because of the deep flaws of the "product" in the first place), and offers its own crappy products (mostly other windblow$-locked-in sw) that of course can only be used with the "product" and need a "new" version every few years, or they'll cease to work. Then there are the pu$hers (marketing droids and "unbiased" mass media) who will knock you down with a tsunami of ads and unlimited shameless praise (there goes a good part of the money you pay for the "product").
All these gangsters are after just one thing - getting rich on your hard-earned money - and don't give a damn if their shitty "product" is hurting the customer, as long as it's cashing in. And they're scared to hell that they'll lose control of the market, and won't spare the dirtiest tricks to eliminate the competition - they love monopolies.
If you feel good being a drug addict, then go the m$ route...
 

eodeo

[citation][nom]Draven35[/nom]LightWave doesn't.[/citation] (support mental ray, that is)

There's always an exception.
[citation][nom]Draven35[/nom]And for the software that does, some people still prefer renderers other than mental ray - V-Ray, Brazil, etc., and lest we forget, RenderMan.[/citation]
Even the most hardcore people using renderer X will switch to MR if they can get similar results 10x faster. 2x faster isn't enough for a major switch; we know that from past experience. 10x faster is entirely attainable, and people using MR as their main rendering engine (and there are plenty of those) are going to be the riders of the storm that's going to change the rendering market.

On top of that, let's not forget that GPUs are many hundreds of times faster than CPUs, and that Nvidia has made it its life goal to make this thing with MR work. Personally, when they're done, I don't doubt that we'll have near-100x faster results. 100x faster is a game changer, hardcore renderer-X user or not. Minor differences in usage and quality are not worth 100x the time to any company. It will open up a new market for Nvidia and put it firmly on top as the global leader in... many things. So it's not like they don't have a motive.

[citation][nom]Draven35[/nom]100x faster is in doubt. Look at CPU- vs. GPU-based encoding for video: you don't see a 100x improvement.[/citation]

Nvidia gave those developers CUDA SDKs and let them play with its own engineers for a while. If you look at CPU/GPU utilization during these encodes, you'll see near-100% CPU usage and less than 10% GPU usage. What's really funny is that when you look at the results, you'll find that 10% GPU usage generally translates into nearly 10x faster encoding across the board.

If there's a reason for people to put a quad-SLI, overpriced Quadro system in every one of their render nodes, Nvidia will find a way.

linux vs m$

Linux, like any non-m$ OS, has compatibility issues. For example, I know for a fact that I can't run my 3ds Max there. If I were ever into an OS that's good for playing music and surfing the web... I'd probably just stick with the Linux version that came preinstalled on my Asus board: Express Gate. I can be online chatting and listening to music in 5 seconds. If, however, I want to actually work (as well), I have to wait a whole minute for the m$ boot. After that, I can do whatever the bleep I feel like. Not that bad of a trade-off - so good, in fact, that I've never had time for the 5-second boot.

If Linux really wants to take over, it should work on supporting the most commonly used software. Changing your OS is a big enough change on its own. Changing every application you've ever worked with isn't a good idea for most users.

I just googled "photoshop cs4 on linux". The very first page I ran into explained how to set it up with Wine. It concluded with "The graphics are a bit wonky sometimes (I'm also running compiz), but otherwise it works great."

Name me one (1) professional who would settle for graphics that are "a bit wonky sometimes".

(not a flame, just facts)
 

MonsterCookie

You might have misunderstood something.
Your software supports Windows exclusively just because it was EXCLUSIVELY made for M$ (probably they are paid by Bill - and not the Clinton one).

So, the problem is the other way around.
Unix/Linux could run these 'cutting edge' applications at any time, but the companies do not allow their software to run on Linux on purpose.
This is because some companies are afraid to lose their virginity (market share, and support money from M$).
 

Draven35

[citation][nom]MonsterCookie[/nom]You might have misunderstood something. Your software supports Windows exclusively just because it was EXCLUSIVELY made for M$ (probably they are paid by Bill - and not the Clinton one).[/citation]

My software isn't 'Windows-exclusive'. In the case of my 3D software, it runs under Windows and OSX, has in the past run on IRIX and Sun, and was originally developed for the Commodore Amiga. Most of the other apps I use are on both Windows and OSX... a lot of the plugins for my 3D software, however, are not.

[citation][nom]eodeo[/nom]If there's a reason for people to put a quad-SLI, overpriced Quadro system in every one of their render nodes, Nvidia will find a way.[/citation]

Actually, I'm quite sure they would rather sell them Teslas to attach all over the place instead of graphics cards.
 

anamaniac

Interesting read, though 3d or audio workstation use is something I will likely never do...
However...

Now I want to get 10 black cube cases, put 4 or more cores in each, connect them via a network, and have myself a pretty cool server. =D
 

MonsterCookie

I do not want to criticize anyone.
If they like it the Windows way, and they can pay for it (both for the software and for a better machine for the same performance), then I can only envy them.
If they sold their product 90% cheaper, it would even make some sense to buy it.

My wild dream in life still remains to get a quad SGI Tezro and install Win7 on it (the second part is a joke, of course, but the first part is dead serious).
 

Torment

Performance per watt needs to be cost-analyzed for these systems as well (including at idle). You can rack up a hell of an electricity bill running older processors. In a lot of cases, you're better off running fewer nodes with faster processors.
 

Draven35

Yes, that is true, but you'd also need to look at electricity rates... would it be worth spending $200 more up front to save $5 a month on your power bill?
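A quick sanity check on those (made-up) numbers:

[code]
# Payback time for the hypothetical above: $200 extra up front,
# $5/month saved on the power bill.
extra_cost = 200       # dollars
monthly_saving = 5     # dollars per month

months = extra_cost / monthly_saving
print(f"Break-even after {months:.0f} months ({months / 12:.1f} years)")
# -> 40 months, about 3.3 years
[/code]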
 

Torment

@Draven35

Back-of-the-envelope calculation:

Let's say you were using the Intel Core 2 Duo E6300. From the charts at http://www.tomshardware.com/charts/desktop-cpu-charts-q3-2008/Cinema-4D-Release-10,835.html, by upgrading to the Core 2 Quad Q9650 you could replace 3.1 nodes with each upgraded node. That alone will save you money, but to my original point: if we assume that the old nodes run at 200 W and the new node at 300 W (since I'm too lazy to look up real numbers), then at 11.59 cents/kWh (the national average, April '09) you'd save about $325/yr in electricity. There will, of course, be a sweet spot, as always, but this readily shows that energy cost *must* be taken into account.
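Spelled out, with every input being one of the assumptions above rather than a measurement:

[code]
# The back-of-the-envelope consolidation math from above.
old_nodes_replaced = 3.1   # E6300 nodes one Q9650 node replaces
old_node_watts = 200       # assumed draw of each old node
new_node_watts = 300       # assumed draw of the new node
rate = 0.1159              # $/kWh, US average April 2009
hours = 24 * 365           # nodes run around the clock

saved_watts = old_nodes_replaced * old_node_watts - new_node_watts  # 320 W
saved_kwh = saved_watts / 1000 * hours                              # ~2803 kWh
print(f"~${saved_kwh * rate:.0f}/yr saved per consolidated node")   # ~$325
[/code]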
 