Buying a server for a neural net

I guess I was under the impression I could show up with a big computer with a bow on it. Very good points though; we should figure out the CPU vs. GPU question first and do some small-scale tests on something like AWS.

Threadripper sounds fun!

Yeah, sounds like you’re wanting a Class IX Gibson with liquid-cooled blast processing modules and a gigabaud intertube connection.

Ok then.

Edit: Sorry, I never saw Hackers.

Thank you. Was immediately reminded of this:

Yes. That was the reference he made.

Yes. And I’m happy he made it.

So the new Vega with the 2TB SSD built in is pretty much specialized for scrubbing through 4k or 8k video footage in bulk.

I am pleased you’re happy. And good day to you, sir.

His answer on the stack/model is Keras with a TensorFlow backend. I’m not sure how that affects the CPU vs. GPU debate. I’m also looking at “cloud” processing (grrrr, why can’t we just call it scaled server-side storage/processing); I need to figure out how they charge vs. what our needs will be. I really want to control the thing completely, though, so at the very least I’m thinking maybe some $3000 Dell server we can crunch on. I could use that same server for test renderings when it’s not in use.

Solid advice. Stick with the cloud, Guap. Your times to start up, shut down, and pivot are all ideal that way. You aren’t bound to a specific hardware vendor, nor support thereof, and you aren’t stuck with aging hardware either. Your only caveat is cost, which can be mitigated by being careful about what you feed to the cloud solution. Time spent on processed shit is still time you pay for, but not doing anything means much lower costs.
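Since the tradeoff above is really just arithmetic, here’s a minimal sketch of the break-even math between buying a box and renting cloud time. Every number is a hypothetical placeholder (the $3000 figure comes from the thread; the hourly rate is made up, not a quote from any vendor):

```python
# Rough break-even sketch: owned server vs. renting cloud compute.
# The $0.90/hr GPU-instance rate is a hypothetical placeholder.

def breakeven_hours(server_cost, cloud_rate_per_hour):
    """Hours of cloud compute you could buy for the price of the server."""
    return server_cost / cloud_rate_per_hour

# A $3000 box vs. a notional $0.90/hr instance:
hours = breakeven_hours(3000, 0.90)
print(round(hours))  # ~3333 hours of on-demand compute before the box "pays off"
```

The real comparison also needs to count maintenance, power, and the fact that idle cloud time costs nothing, which is exactly the point being made above.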

Or do it all in the browser! :)

https://tenso.rs/demos/fast-neural-style/

Conglomerated Logistically Off-premise User-rented Devices

That very firmly puts you in the GPU category, as expected. If you want to build your own box and fill it with Nvidia 1070s/1080s rather than setting up cloud accounts, that’s perfectly viable, but it does have disadvantages.

Be aware that installing Linux and the software stack, while not rocket science, is filled with landmines. It’s all moving fast, and I had a heck of a time getting Keras and Theano working: issues with it not deallocating GPU memory without a reboot, among other annoyances. TensorFlow looked better in retrospect. The cloud largely removes this aspect by having prebuilt images ready to go for certain stacks.
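One common workaround for the GPU-memory-not-freed problem is to run each training job in a short-lived child process, so the OS reclaims everything (GPU memory included) when the worker exits. Here’s a minimal sketch of that pattern; `train_job` is a hypothetical stand-in for a real Keras/Theano run, not anyone’s actual code:

```python
# Workaround sketch: isolate each training run in its own process so that
# GPU memory is reclaimed by the OS on worker exit, no reboot needed.
import multiprocessing as mp

def train_job(config, queue):
    # In real use, framework imports and model.fit(...) would live here,
    # so the GPU context dies with this process. For the sketch, we just
    # echo a fake result.
    queue.put({"config": config, "loss": 0.123})

def run_isolated(config):
    queue = mp.Queue()
    p = mp.Process(target=train_job, args=(config, queue))
    p.start()
    result = queue.get()   # fetch before join to avoid a queue-buffer deadlock
    p.join()               # worker exit frees all of its memory, GPU included
    return result

if __name__ == "__main__":
    print(run_isolated({"lr": 0.01}))
```

The cost is the process-startup and framework-import overhead per job, which is usually negligible next to training time.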

Thanks for all the advice, I think you guys talked me out of getting a server, as did he. It’s not going to be practical from a storage, processing, maintenance, security, or cost point of view. So pretty much a bad idea.

It’s a changing world where everything is SaaS and only the people who run the servers actually buy the servers. I guess the old fart in me wanted a server room, but the entrepreneur in me loves the fact that I can just pay for scalable computing and storage.

We are looking at both FloydHub and Heroku. Which is funny, because FloydHub’s tagline is “Heroku for Deep Learning”.

Our next step is gathering enormous data sets but that’s way off topic for this sub. Thanks again for the advice guys!

If only it were an acronym. That’s not a bad shot at it though. :)