What could this computer handle for say the next 10yrs

This forum is for help with and discussion about your video hardware.


Postby Rev. Takahashi Fan » Mon Dec 18, 2006 10:30 am

member of the "Pikachu must die!" club.
User avatar
Rev. Takahashi Fan
 
Joined: 21 Feb 2006
Location: Some Army Base

Postby DJ_Izumi » Mon Dec 18, 2006 10:48 am

It'll be 'acceptable' in 5 years. You pretty much can't expect to build a computer that will last you until the 10-year mark while keeping it a highly useful desktop. It's just not happening. Also, in 1-2 years you'll be able to buy what's in that machine for half of what you'd pay today.

Anyway, you're still overpaying for that machine, even if you want something 'top of the line'.

$6000 for a computer... Hah. I could build three scary machines for that much money.
User avatar
DJ_Izumi
 
Joined: 03 Oct 2001
Location: Canada

Postby Keeper of Hellfire » Mon Dec 18, 2006 12:34 pm

Definitely overpaid, since it uses an expensive FX-62, which is even beaten by the much cheaper E6400. Check this test. If you want a system that you can use in 10 years for, let's say, Internet browsing, get the fastest Intel quad-core CPU that's available at the moment.
User avatar
Keeper of Hellfire
 
Joined: 09 Jan 2005
Location: Germany

Postby DJ_Izumi » Mon Dec 18, 2006 12:36 pm

If you want a machine you can use in 10 years, you spend $1k-2k on a computer now and put the rest in the bank so you can buy a new computer later. |:
User avatar
DJ_Izumi
 
Joined: 03 Oct 2001
Location: Canada

Postby Kariudo » Mon Dec 18, 2006 1:58 pm

Alienware is nice...but it's like paying $3 a gallon for gas when you can make it yourself for $1 a gallon

for the same price (ok... it's $230 cheaper... not counting mail-in rebates), here's what you can get from Newegg:
Thermaltake case (full tower ATX)
eVGA mobo with nvidia 680i chipset
quad core intel cpu (2.66 GHz)
2GB corsair XMS2 dual-channel ddr2 800 5-5-5-12 @ 1.95v
2x GeForce 8800GTX
Creative sound blaster audigy 2 ZS platinum 7.1 channel (with some other nice things)
1000W PSU with four 12V rails (SLI capable)
3x Seagate 320GB 7200rpm sata2 HD's
24" monitor, 1920x1200, with lots of inputs (including HDMI)
blu-ray burner
Logitech G15 keyboard
Logitech G5 laser mouse (wired)

and Windows XP Pro with an upgrade coupon for Vista

how's that for a 1-up? (heck... you could probably add a nice water-cooling system for your CPU, chipset and GPUs and still be under the Alienware price)

but in all practicality...just get a decent setup for now, and then do it again 4-5 years down the road. (so pretty much what everyone else said)
User avatar
Kariudo
Twilight prince
 
Joined: 15 Jul 2005
Location: Los taquitos unidos
Status: 1924 bots banned and counting!

Postby DJ_Izumi » Mon Dec 18, 2006 2:06 pm

Still, even a quad-core proc likely wouldn't last you 10 years. Look back 10 years and consider what the 'best' computer you could get then was. Even with an unlimited budget, we probably couldn't have built a desktop system using components of the day capable of running most of the software we use today without crapping itself. Even the Pentium II wasn't out 10 years ago. You'd be building a multi-CPU Pentium 1 workstation-like system, with as much EDO RAM as you could put in it, powered by 3D Rage (Rage I) graphics... Or something comparable.
User avatar
DJ_Izumi
 
Joined: 03 Oct 2001
Location: Canada

Postby BasharOfTheAges » Mon Dec 18, 2006 2:35 pm

Am I the only one under the impression that we've hit a wall in terms of marketing the next big thing? Sure, up to about 5 years ago real progress was made to bring to the forefront all the innovation we could theoretically use on a day-to-day basis. For the last few years mobile technologies have gotten smaller and smaller and storage capacities larger and larger, but it seems like it should ease up or slow down soon. The average consumer might not be able to see that the system they have is 10x more powerful than they need it to be for what they do with it now, but when it's 100x more?

I think it'll be hard to fund groundbreaking technologies and innovation when all you're marketing to is top-of-the-line niche groups that make CG movies or whatnot. The average Joe will never need a CPU designed for the collective data requirements of an entire business when he's just surfing the web. It being "the thing" is a mentality that works up until the sheer ridiculousness of having an 8-core processor and 16GB of RAM to check your email and browse the web is what's being offered to you.

Even for high-end non-business applications, there's a definite horizon coming up unless major innovation occurs. There's a limit to how real something can look or how big a monitor will actually fit on your wall. Graphics had better get 3D-immersive some time soon or there's going to be nowhere to go. Sure, things can get minutely better, but 99% of all people wouldn't be able to tell the difference. There's a finite number of steps you can take down a path that's meant to compare rendered creations to reality, and once you get really close, nobody's going to notice if you take that extra step. The human brain can't see those minute differences.
Another Anime Convention AMV Contest Coordinator 2008-2014 & Head of the AAC Fan-works Theater - follow us on Twitter: https://twitter.com/#!/AACFanTheater
:sorcerer: :sorcerer: |RD: "Oh, Action!" (side-by-side) | |
User avatar
BasharOfTheAges
Just zis guy, you know?
 
Joined: 14 Sep 2004
Location: Merrimack, NH
Status: Extreeeeeeeeeme

Postby Kariudo » Mon Dec 18, 2006 3:09 pm

it could be getting to that point, but the hardest part of invention is thinking of something new.

have we hit a wall for other electronic items such as data storage? (ie dvds, blu-ray)
the advent of blu-ray isn't really too new: a more powerful (shorter-wavelength) laser and a slightly different storage medium... but what's after blu-ray? (holographic disc storage, perhaps? but then what's after that?)

we don't know yet, and when we think that there's nowhere else to go in this area, something new will come out.
But there is still a limit to this (I imagine the best possible technology for this type of thing to be something like sending/receiving signals wirelessly to/from the brain, to show video directly in the visual areas of the brain without the need for eyes)

the computer as we know it could very well be ending its evolutionary cycle, thus making room for the next "generation" of computers (once again, something like mind-linking or a true AI. why am I thinking about Chobits for the second one?)

I'd like to see the day that a human brain can be supplemented with a CPU.

New technologies (I mean really new, groundbreaking, nobel prize winning stuff) seem to come in groups with a key invention leading the way.
eventually, some other common use of a computer will come out that requires 8 cores, 16GB of RAM, and a host of other things. (or else it wouldn't be economically feasible to develop such things)

this is kinda why I like reading <s>Nerdular Nerdence</s> Popular Science. I can marvel at things that might be coming out, groundbreaking tech. (For leisure, I guess; feeling over my head makes me feel good, since I wasn't really challenged by school until just recently.)
One of my recent favorites was an article about using a virus to catalyze the reaction that creates an electric potential difference across two surfaces (i.e. a virus battery). The claim was that a small piece of "tape" (think the size of a typical piece you'd pull from a magic tape dispenser) of this stuff could provide as much power as the 12V Li-ion battery found in most laptops today
User avatar
Kariudo
Twilight prince
 
Joined: 15 Jul 2005
Location: Los taquitos unidos
Status: 1924 bots banned and counting!

Postby Keeper of Hellfire » Mon Dec 18, 2006 3:12 pm

BasharOfTheAges wrote:It being "the thing" is a mentality that works up until the sheer ridiculousness of having an 8-core processor and 16GB of RAM to check your email and browse the web is what's being offered to you.
You underestimate Microsoft. Wait 3...4 more Windows generations, and these will be the minimum requirements to run Windows. :D And newer technologies on the internet will require newer browser and email proggies - which require the latest Windows.

Never say never in technological development. If I think back to my first PC - 1MB RAM - that's the average L2 cache of a current CPU. A 20 MB HDD as a "mass storage device" - hell, even my graphics card has much more memory than that, and at the time I couldn't imagine ever filling it. 20 MB - that's around 2 seconds of HuffYUV video.
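For what it's worth, the "20 MB is about 2 seconds of HuffYUV" figure checks out on plausible assumptions. The resolution, frame rate, and compression ratio below are my guesses for a typical 2006 capture, not anything stated in the post:

```python
# Rough sanity check of "20 MB is about 2 seconds of HuffYUV".
# Assumed parameters (hypothetical, not from the post):
# 640x480 RGB24 at 29.97 fps, HuffYUV compressing to ~40% of raw size.
width, height, bytes_per_pixel, fps = 640, 480, 3, 29.97
raw_rate = width * height * bytes_per_pixel * fps   # uncompressed bytes/s
huffyuv_rate = 0.4 * raw_rate                       # ~2.5:1 lossless compression
seconds = 20 * 1024 * 1024 / huffyuv_rate
print(seconds)  # ~1.9 s on these assumptions - close to the post's "2 seconds"
```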

My first data connection was at 2400 bit/s; now I'm sitting behind 1 MBit/s DSL. What do you think I'll have in 10 years? I'm confident it's 1 GBit/s.
User avatar
Keeper of Hellfire
 
Joined: 09 Jan 2005
Location: Germany

Postby DJ_Izumi » Mon Dec 18, 2006 3:24 pm

Keeper of Hellfire wrote:My first data connection was at 2400 bit/s; now I'm sitting behind 1 MBit/s DSL. What do you think I'll have in 10 years? I'm confident it's 1 GBit/s.
I dunno about you, but the speed I have access to via cable Internet hasn't really changed in the 5.5 years since I got it. Some variation as I've changed providers, but my experience hasn't seen any dramatic increase. Adding to that, it's not often that I even max out my existing connection.
User avatar
DJ_Izumi
 
Joined: 03 Oct 2001
Location: Canada

Postby Kalium » Mon Dec 18, 2006 3:34 pm

Kariudo wrote:I'd like to see the day that a human brain can be supplemented with a CPU.

That happened as soon as spreadsheets, databases, computer graphics, and word processing (among other things) were invented. In other words, a long time ago.
User avatar
Kalium
Sir Bugsalot
 
Joined: 03 Oct 2003
Location: Plymouth, Michigan

Postby BasharOfTheAges » Mon Dec 18, 2006 3:58 pm

Keeper of Hellfire wrote:
BasharOfTheAges wrote:It being "the thing" is a mentality that works up until the sheer ridiculousness of having an 8-core processor and 16GB of RAM to check your email and browse the web is what's being offered to you.
You underestimate Microsoft. Wait 3...4 more Windows generations, and these will be the minimum requirements to run Windows. :D And newer technologies on the internet will require newer browser and email proggies - which require the latest Windows.


Microsoft leading the way with intentionally bloated software to sell more top-of-the-line computers? I believe it. But as for it really being necessary, you're just proving my point. It isn't and won't be. I think eventually people will realize that.

Keeper of Hellfire wrote:My first data connection was at 2400 bit/s; now I'm sitting behind 1 MBit/s DSL. What do you think I'll have in 10 years? I'm confident it's 1 GBit/s.


If you look at certain Asian markets, you'll see that speeds that fast already exist. No ISP in a major market (the U.S., most of Europe, etc.) will touch that, though. They know 1 in a million people would use that kind of bandwidth potential, and most people would simply share connections, cutting into their profits significantly.

I have no doubt (as Kariudo pointed out) that new inventions will come, but by and large those will most likely not be desktop computers - rather something more. I think most of the new invention will be in I/O, though I know the "numbers game" (bigger = better, amirite) will still be marketed for quite some time. There's a ceiling for our current method of progression, not in how much bigger and better we CAN make it, but in how much that will really matter with properly designed software. No sense intentionally making bloated O(2^n) algorithms just because you have a processor that can figure them out anyway...
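The algorithmic-bloat point is easy to make concrete with a toy sketch (a generic textbook example, not anything from the thread): the same answer computed by an O(2^n) algorithm and by an O(n) one.

```python
# Fibonacci two ways: the classic illustration of algorithmic bloat.

def fib_exponential(n):
    # Naive recursion makes O(2^n) calls; fast hardware only hides
    # the cost for small n.
    if n < 2:
        return n
    return fib_exponential(n - 1) + fib_exponential(n - 2)

def fib_linear(n):
    # The same answer in O(n) steps; no big processor required.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

assert fib_exponential(20) == fib_linear(20) == 6765
```

A faster CPU shifts the point where `fib_exponential` becomes unusable by only a handful of `n`; the linear version never hits that wall.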
Another Anime Convention AMV Contest Coordinator 2008-2014 & Head of the AAC Fan-works Theater - follow us on Twitter: https://twitter.com/#!/AACFanTheater
:sorcerer: :sorcerer: |RD: "Oh, Action!" (side-by-side) | |
User avatar
BasharOfTheAges
Just zis guy, you know?
 
Joined: 14 Sep 2004
Location: Merrimack, NH
Status: Extreeeeeeeeeme

Postby Kariudo » Mon Dec 18, 2006 4:11 pm

Kalium wrote:
Kariudo wrote:I'd like to see the day that a human brain can be supplemented with a CPU.

That happened as soon as spreadsheets, databases, computer graphics, and word processing (among other things) were invented. In other words, a long time ago.

by that I meant that a person could add a CPU to their brain, sending commands from the brain (without the need for appendages like these fingers, which take a lot of time to coordinate to move in the correct pattern to get the desired result) to carry out simple comparisons and calculations at blinding speed (comparatively)

If I told you to compute the value of 7^6, it would take a little while, but it's possible. A CPU can give you the answer much, much faster. (I know it takes a bit more than just the CPU to do this... but my point still stands.)
at 2 GHz and 16,807 additions (assuming 8 "operations" per addition: 2 to read/store from a memory address and 2 to perform the addition... I don't know how it really works yet), it would take ~0.000067228 seconds to get the answer
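Those back-of-the-envelope numbers can be reproduced in a few lines, taking the post's own assumptions (16,807 additions, 8 operations each, 2 GHz clock) at face value:

```python
# 7^6 by repeated addition of 7, plus the post's timing estimate.
result = 0
for _ in range(16_807):        # 7^5 = 16,807 additions of 7 yield 7^6
    result += 7
assert result == 7 ** 6 == 117_649

cycles = 16_807 * 8            # 8 "operations" per addition (post's assumption)
seconds = cycles / 2e9         # 2 GHz clock
print(seconds)                 # 6.7228e-05, matching the ~0.000067228 s above
```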
User avatar
Kariudo
Twilight prince
 
Joined: 15 Jul 2005
Location: Los taquitos unidos
Status: 1924 bots banned and counting!

Postby DJ_Izumi » Mon Dec 18, 2006 4:16 pm

Or... You could use a calculator. o.O

I think my idea works much better.

Let's not even touch on the topic of you suggesting something as wasteful as using a 2.0 GHz processor to do high school math. A Zilog Z80 would be overkill. How would you even propose to POWER such a thing? It'd need external batteries, 'cause you sure as hell can't leech enough bioelectric energy out of a human body to run a 2 GHz CPU. :/

Please, tell me you don't actually think that pile of sci-fi crap is a good idea.
User avatar
DJ_Izumi
 
Joined: 03 Oct 2001
Location: Canada

Postby Kariudo » Mon Dec 18, 2006 4:45 pm

DJ_Izumi wrote:Let's not even touch on the topic of you suggesting something as wasteful as using a 2.0 GHz processor to do high school math.

it was just an example... geez. I didn't really want to go too far into something like how it could find Taylor polynomials for functions/series and other things of the sort (improve engineering by eliminating a common source of human error, poor approximation... and being able to get better approximations for more complicated situations)

How would you even propose to POWER such a thing?

for the power... I didn't think of that, but by the time that's feasible the CPU process size will probably be in picometers or femtometers... and power consumption would also go down. hey! what about that virus battery I was talking about, hmm?

Or... You could use a calculator. o.O

calculator? pshaw! what with the use of the human brain for simple calculations and the reliance on fingers, the amount of time wasted in transit is enormous (and error-prone).

Please, tell me you don't actually think that pile of sci-fi crap is a good idea.

part of my mentioning that was me thinking it was a good idea.
I don't tell you how to dream...
<_<
User avatar
Kariudo
Twilight prince
 
Joined: 15 Jul 2005
Location: Los taquitos unidos
Status: 1924 bots banned and counting!
