I am a Linux community newbie and this is my first post.
I'm an engineer and work for a corporation which, despite its many commendable merits, is constrained by a draconian IT policy that gives us no option but an archaic 32-bit Windows XP OS. Beyond the wasted productivity and needless cost when a far superior open-source OS is readily available, being forced to use 32-bit Windows XP is a major impediment: it prevents me from using the computer resources I have on hand to run the massive 3D modelling simulations I need for my work. I appreciate that our local IT guys are only enforcing the policies they've been given, probably without being asked their opinion on them, and I hope to find a workaround without rocking the boat.
Here's Plan A:
I understand Linux can be loaded onto a USB flash drive and the entire OS can be run from the USB drive without ever actually installing it on the computer's disk. The programs I want to run only require me to launch them from the shell and point them at the appropriate data files; they don't need a GUI or any installation. Once running, it's just a massive amount of number crunching for up to a few days, using all the available CPU and RAM. When the process completes, I could transfer the computed data back to my Windows laptop on the USB stick and do all the visualization and analysis with the fancy commercial programs I have for Windows.
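To give a concrete idea of the workflow I have in mind, here's a rough sketch of how I'd launch a run from the shell so it keeps going even if I log out. The solver name and input file are placeholders (I've substituted `sleep 5` as a stand-in so the commands actually execute):

```shell
# Launch the long-running job in the background with nohup so it
# survives the terminal session ending. Replace "sleep 5" with the
# real invocation, e.g.  ./my_solver input_model.dat
nohup sleep 5 > run.log 2>&1 &

# Save the process ID so the run can be checked on later.
echo $! > run.pid

# Confirm the job is still running, and peek at its output so far.
ps -p "$(cat run.pid)"
tail run.log
```

When the run finishes, the output files would just be copied back onto the USB stick for analysis on the Windows side.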
So, my first question is whether this Plan A is feasible. I know Linux would run better properly installed on the desktop's disk rather than off the USB drive, but how adversely would running from USB affect Linux's ability to fully utilize the computer's RAM and distribute computation among the four cores of the Xeon chip? It may be worth mentioning that the limiting factor on the size of problems I can run is much more the amount of RAM that can be addressed than the speed or number of CPU cores.
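I assume that once booted into the live USB session I could sanity-check what the system actually sees with a few standard commands:

```shell
# Number of CPU cores the kernel sees
nproc

# Total and available RAM
free -h

# Confirm the running kernel is 64-bit (needed to address
# more than ~4 GB of RAM without PAE tricks)
uname -m
```

If `nproc` reports all the cores and `free -h` shows the full RAM, I'd take that as a sign the live session isn't holding anything back.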
The next question is, if running big computational problems with Linux on a USB drive is feasible and worth pursuing, which Linux distribution would be best for it? Does it really matter which 64-bit distribution I use, or does it all just depend on the kernel? I know the merits of different distributions can stir up controversy, and I hope that doesn't distract too much from the main reason for my post.
Based on the web research I've done so far, I'm leaning towards Linux Mint or Ubuntu because they're both popular and reported to be good choices for newbies. My friend runs big computational problems like the ones I want to run on a Linux Beowulf cluster under CentOS. He may have chosen CentOS for its networking capabilities, which wouldn't apply so much to me since I just want to run the programs on a standalone desktop PC using all of its local CPU and RAM. Even so, it makes me think CentOS could be a good way to go simply because it's what he's been using to run the same programs for years.
I thank the Community in advance for their thoughtful advice and support.