Overclock.net

1 - 10 of 10 Posts

Registered · 168 Posts · Discussion Starter #1
I'm about to dive into Linux in my IT classes at school, so I figured I would build a Linux system at home to learn on. I also want a home server to work with for the same reason, so I chose Fedora/Amahi to play around with. Now to my question.

The system I'm going to use won't have more than 4GB of RAM, so is there any advantage to installing 64-bit when dealing with the server side of things?

Thanks in advance.
 

Premium Member · 13,477 Posts
It may perform slightly better, but may also use more memory and disk space. Since you haven't said anything about the hardware I'm not going to make a recommendation.
 

Registered · 168 Posts · Discussion Starter #3
P4 630 (3GHz, 2MB L2, HT), 2GB of RAM, a GS8300 GPU, and probably about 250GB worth of storage. Some of that may change, as I'm building the system with parts out of my "old computer parts box" lol. I find new stuff every time I go through it.
 

Expand Always in Always · 4,596 Posts
I'd also just go with the 32-bit OS for now. I've run my Amahi server with 2GB; the recommendation is 512MB, so 2GB will be more than enough. As for the advantages of a 64-bit OS, I think it depends on what you're planning to do.
 

Premium Member · 13,477 Posts
Quote: Originally Posted by nil405
P4 630 (3GHz, 2MB L2, HT), 2GB of RAM, a GS8300 GPU, and probably about 250GB worth of storage. Some of that may change, as I'm building the system with parts out of my "old computer parts box" lol. I find new stuff every time I go through it.

It's 64-bit capable, but I probably wouldn't bother unless you intend to have more than 3.5GB of RAM someday.

Quote: Originally Posted by Squeeky

32-bit is what I would go with. 64-bit adds extra complications; if you're learning, go 32-bit and learn about 64-bit later. 64-bit has a few issues, with programs needing 32-bit libs, etc., so it can get a bit more confusing.

For a server build this is pretty much irrelevant.
 

Registered · 712 Posts
Quote: Originally Posted by error10

It may perform slightly better, but may also use more memory and disk space. Since you haven't said anything about the hardware I'm not going to make a recommendation.

Well, I know you said "may"... but often, 64-bit binaries are going to be slower than 32-bit ones: data structures are often bigger, and loads into the CPU registers cost more with 64-bit. You can gain performance if you're doing large calculations that make use of the wider data handling...

To the OP: I think you can just start with a 32-bit OS.

But I don't know what your IT class is going to teach. I might suggest using an enterprise Linux distro instead; it will ultimately be more useful in the workplace. If your IT class is more about programming on the Linux platform, then it may not matter at all. But if you're going to be learning system administration, then I would choose an enterprise distro.
 

Premium Member · 13,477 Posts
Quote: Originally Posted by BLinux

Well, I know you said "may"... but often, 64-bit binaries are going to be slower than 32-bit ones: data structures are often bigger, and loads into the CPU registers cost more with 64-bit. You can gain performance if you're doing large calculations that make use of the wider data handling...

I keep hearing things like this, and it doesn't make a lot of sense. Got some evidence of this?
 

Registered · 712 Posts
Quote: Originally Posted by error10

I keep hearing things like this, and it doesn't make a lot of sense. Got some evidence of this?

This isn't anything new, really... we saw the same thing when Intel was migrating from 16-bit to 32-bit during the 286 -> 386 transition. (My age is showing in this conversation, huh?)

You need a little background in programming with strongly typed languages, and maybe a basic understanding of assembly. If you're loading a 64-bit integer type when you're only using the lower 32 bits (or fewer), you're doing extra copying of zeros into the processor's registers. The instruction encodings are also larger, and pointers are larger too; that's why executable binaries are larger for 64-bit vs. 32-bit code. It's as simple as the fact that you need to do more work copying larger data types, larger pointers, and larger instructions to run 64-bit vs. 32-bit code. Even reading a 1MB 64-bit executable off disk is going to take longer than reading the 32-bit version, which might be 512KB. I think you get it...

In the bigger picture it doesn't really matter, because processor technologies get faster and faster and people soon forget the "slow down"... and sooner or later, programmers start writing code that can take advantage of the larger pointers (if they need to address more memory for the program; remember the 4GB limit on 32-bit Intel?), start handling data that actually needs the 64-bit data types, etc. Once you reach that stage, you're going to be faster than the older, "smaller" architecture, because handling a 64-bit integer on a 32-bit platform requires double the work (put in oversimplified terms).
 

Premium Member · 13,477 Posts
Quote: Originally Posted by BLinux

This isn't anything new, really... we saw the same thing when Intel was migrating from 16-bit to 32-bit during the 286 -> 386 transition. (My age is showing in this conversation, huh?)

You need a little background in programming with strongly typed languages, and maybe a basic understanding of assembly. If you're loading a 64-bit integer type when you're only using the lower 32 bits (or fewer), you're doing extra copying of zeros into the processor's registers. The instruction encodings are also larger, and pointers are larger too; that's why executable binaries are larger for 64-bit vs. 32-bit code. It's as simple as the fact that you need to do more work copying larger data types, larger pointers, and larger instructions to run 64-bit vs. 32-bit code. Even reading a 1MB 64-bit executable off disk is going to take longer than reading the 32-bit version, which might be 512KB. I think you get it...

In the bigger picture it doesn't really matter, because processor technologies get faster and faster and people soon forget the "slow down"... and sooner or later, programmers start writing code that can take advantage of the larger pointers (if they need to address more memory for the program; remember the 4GB limit on 32-bit Intel?), start handling data that actually needs the 64-bit data types, etc. Once you reach that stage, you're going to be faster than the older, "smaller" architecture, because handling a 64-bit integer on a 32-bit platform requires double the work (put in oversimplified terms).

OK, now that all makes sense, though a compiler that knows how to optimize for 64-bit can solve most of that. And most code written for 32-bit needs to be cleaned up to compile as 64-bit anyway.

And really, if a 64-bit binary is 1MB, then the 32-bit binary is more likely to have been about 800KB. Often the programs are virtually the same size. (The nice thing about being on a multilib system is that I can just go take a look.)

Overall, I don't think you're going to notice any slowdown outside of synthetic benchmarks.
 