Most of my regular readers can (and will) skip this one.
Basically, I scoured the net for reviews of my new video card before I bought it. I couldn’t actually find a ‘professional’ review, and was stuck with those terrible, terrible ‘user’ reviews you find on the Circuit City, Best Buy or CompUSA sites.
Those reviews are always written by one of two types of people: the people who’ve never owned a PC before, and think everything their decidedly mediocre card can do is absolutely amazing…and the people who’ve had a bad experience with the company, or didn’t read the minimum system requirements, and are calling a great piece of equipment crap because it won’t work on their system.
So, considering I got hundreds of hits from people searching for reviews of the graphics pad I bought, I’ll try to write a decent review here.
OK, here’s a review of the VisionTek Xtasy (see? Put an X at the start, the kids’ll love it) ATI Radeon X1300 512MB PCI Express graphics card.
The first thing to mention about this card is the price. We appear to have three choices right now: buy a $50 card that won’t run anything, a middle-of-the-road card like the X1300 for around $100-$200, or go to the other end of the spectrum and buy a $600 graphics card.
As a little hint, CompUSA has the 256MB version of this card for $150, but Circuit City has the 512MB version for $200 with a $50 mail-in rebate. If you can spare the extra $50 for a few weeks, go for the 512MB version: double the memory, for (eventually) the same price.
Let’s start with the installation. I’ve had problems with graphics cards before, so I always expect the worst when I install one. Luckily, the X1300 is an absolute breeze to install, especially if you’re replacing an ATI card, as they use the same drivers.
Installation went like this for me:
Turn off the computer.
Pull the side off the case.
Remove the back plate for the PCIe slot.
Snap the card into place.
Close the case.
Then I started the computer, Windows XP detected the new card and told me to restart, which I did…and I was up and running.
Of course, if you aren’t replacing an ATI card, you simply put the card in, and when Windows detects it, put in the supplied driver CD.
(A hint for new users: if your current card is an onboard card (that is, a ‘card’ integrated into the motherboard), you need to disable it before you install the new one. Turn on your computer, press F2 (or whatever key your motherboard uses) to get into the BIOS, go to ‘Integrated Peripherals’, and set ‘Graphics Init First’ to ‘PCIe Slot’. Basically, this tells your computer to check the PCIe slot for a graphics card before it starts up the onboard one. Simple.)
Now we move on to performance.
This is a budget card, so don’t expect absolutely blistering performance. On the other hand, for a budget card, you get a lot for your money.
I’m running this card in a 2.3GHz AMD Athlon 64 system with 512MB of memory. Not exactly a hardcore gaming machine, but this card would run Doom 3 at 1024 x 768, with all graphics options as high as they would go, at a very smooth 40fps. It also ran Call of Duty 2 at the same resolution, again with all the screen effects and graphics options maxed out, at about 50fps.
The only slight problem I encountered was that Doom 3 would occasionally freeze for a second every 10 minutes or so. However, a little experimenting showed that this was happening when the computer was accessing the swap file. (The swap file is an area on your hard disk that your computer uses as memory when your system memory runs out.) In other words, my measly half-gig of system memory was acting as a bottleneck; it had nothing to do with the card itself.
The one game that amazed me was The Elder Scrolls IV: Oblivion. With all the graphical options as high as they go, I got a playable frame rate (around 20fps) when running at 800 x 600. Bear in mind that 20fps is the minimum frame rate it would drop to, and that only happened out in the wilderness, when the graphics card has to render a few hundred thousand trees. If you’re in a dungeon or a town, the frame rate jumps up considerably.
Now, I know 20fps is not exactly blistering, but it’s definitely a playable frame rate. Bear in mind that people have reported stutters and occasional low frame rates on much more powerful hardware. For example, Tim Buckley of ‘Ctrl+Alt+Del’ reported stutters, and he was running the game on not one but two Nvidia GeForce 7800 GTX cards, with four gigabytes of XMS RAM to my half a gigabyte.
In other words, Oblivion is one of the most graphics-intensive games on the PC right now. The fact that a budget card can run it at all is amazing. My old card, the onboard Radeon Xpress 200, managed to run the game at about a frame every 6 seconds, with all the graphics options as low as they would go, at 640 x 480.
Basically, if you want a nice high frame rate all the time in this game with an X1300, just turn the screen effects off; the Bloom lighting and HDR effects look nice, but the game still looks great without them. You don’t actually need to, though, as the game stays at a playable frame rate either way.
In short, this card will run any game you throw at it. You might not be able to have everything as high as it will go, but if you’re on a budget like me, you’re probably used to that by now.
Now on to the extra bells and whistles.
The number one great thing is that this card is CrossFire enabled. In other words, if you have two PCIe slots, you can throw another X1300 in there and run them both at the same time, roughly doubling your graphics processing power at a stroke (the memory is mirrored across the cards rather than pooled, so that part doesn’t double). In short, if you run two of them, you’re getting close to the power of a $600 card for $300 (if you go the Circuit City route, and actually remember to send off for your rebates).
Now, I know I’ll probably be corrected by someone, as I haven’t experimented with two cards at once since the very first 3dfx cards, but in theory, if you’re running a game at 40fps with everything as high as it will go, two of these will get you towards 80fps, or keep you at 40fps at a much higher resolution.
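If you want to play with the numbers yourself, here’s the back-of-the-envelope math as a quick Python sketch. The scaling factors are my own guesses (perfect doubling is a theoretical best case; real games scale worse), not anything I’ve measured:

# Idealized CrossFire scaling estimate. The factors below are guesses,
# not measurements -- real-world scaling varies from game to game.
single_card_fps = 40      # what I measured on one X1300 in Doom 3
ideal_factor = 2.0        # perfect doubling: the theoretical best case
realistic_factor = 1.6    # a rough guess at what you'd actually see
print(f"Ideal two-card estimate: {single_card_fps * ideal_factor:.0f} fps")
print(f"More realistic estimate: {single_card_fps * realistic_factor:.0f} fps")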
Other than that, this card also has video out, with cables supplied for RCA and S-Video…and very impressively, it can also send out an HD signal. So if you’re one of those rich people with a 70-inch HDTV, you’ll need to buy an adapter, but you can hook it right up to your TV. (Of course, if you can afford a 70-inch plasma, you can probably also afford an $800 X1900 card and won’t be reading this.)
I have only two bad things to say about this card, and they really are minor niggles.
The first is that this card (like all powerful GPUs) gives off quite a lot of heat. I noticed my cooling fans were coming on a lot more often and working a lot harder than usual. However, I’ve never come close to overheating. As long as your case has an intake and an exhaust fan (one to suck cold air in, and another to pump the hot air out…standard on most cases now), you’ll be absolutely fine.
The one other thing that might be a problem (though this is more a case of me having a crappy motherboard) is that the heat sink and cooling fan on the card protrude quite a bit. On my motherboard, the fan hangs over the neighboring PCI slot, meaning I can’t use it. In my case there was nothing in that slot anyway, but some people could have problems.
Also, and this isn’t a bad point, just something to watch out for: this card requires a 300 watt (or greater) power supply. My power supply unit is exactly 300 watts, and the card ran without a hitch. I only mention this because it’s the one thing many new users don’t think about. So before you buy, open your case and check the wattage on your PSU’s label. (Your PSU is the big box with the fan that the power leads come out of.)
The reason I bring it up is that I read one of those awful “This card is crap, it doesn’t work, don’t buy it” user reviews, and that was exactly the problem. He was trying to run this card on a rather wimpy 200 watt PSU. At that level it’s like trying to start a car with a AA battery.
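If you want a rough sense of whether your PSU is up to the job, here’s the sort of back-of-the-envelope check I’d do. Every wattage figure below is an illustrative guess for a system like mine, not a measured or official number, so substitute your own components’ figures:

# Rough PSU headroom check. All draws are ballpark guesses, not specs --
# look up the real figures for your own components before trusting this.
draws_watts = {
    "CPU": 90,
    "X1300 graphics card": 50,
    "motherboard + RAM": 50,
    "drives + fans": 40,
}
psu_rating_watts = 300
total_draw = sum(draws_watts.values())
print(f"Estimated draw: {total_draw}W of {psu_rating_watts}W "
      f"({psu_rating_watts - total_draw}W headroom)")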
In closing, I did a lot of research as well as testing this card personally, and it is quite simply the best video card you can buy for the money. Go $50 cheaper and you’ll get a card that’s not even half as good, and considering the next price point up is the $400-$600 range…you’re getting a deal.
(Now, for the non-n00bs, here are the technical specs:)
450MHz VPU
512MB of GDDR2 memory at 533MHz
Full DX9 compatibility
Shader Model 3.0
VGA, DVI-I, TV-Out and HDTV out
Max resolution of 2560 x 1600 at 60Hz, 1.07 billion colors.
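(In case you’re wondering where that last figure comes from: 1.07 billion is just 2 to the 30th power, which I’d assume means 10 bits per color channel instead of the usual 8. A quick Python check:)

# 10 bits per RGB channel = 30 bits total, versus the usual 8 per channel.
print(f"{2**30:,}")  # 1,073,741,824 -- the '1.07 billion colors' figure
print(f"{2**24:,}")  # 16,777,216 -- ordinary 24-bit color, for comparison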
10 comments:
head spinning.
Great information, gonna hand it off to husband though.
Very Appreciated.
My gearhead wife won at the bingo, and ran off to buy this, after rebate, for CAN$170 plus tax...I was a little gunshy of the 'visiontek' line, but, I think she'll be just fine with it after this review. :)
Glad I can help. Visiontek make good products, and you save $$$ by not buying a direct-from-ATI card.
Great info... I just bought the Radeon X1300 XGE and the only thing I really noticed that is different is it says 450MHz + 150MHz Overclocked = 600 MHz
I am in IT for a living... Not a video card guru...
But was just curious if you have tried any of the 3dMark05 benchmarks on your card?
If so... What frame rates are you getting on the free version's games...
Mine seem really low at 10-19 in the first demo... And am not sure if there is some serious tweaking I need to do...
Thanks
Just go to the ATI website and get the latest drivers.
On the other hand, for some reason 3DMark doesn't like the card very much. I got fairly low scores and framerates...but it'll run Oblivion and Doom 3 without much trouble. Oblivion has the occasional stutter, but is extremely playable, and Doom 3 runs like butter, with everything as high as it will go.
just thought i would let you know. i also just got this card...the exact same one. but for 60$. yay for me!
I just bought this same card for $36.99
Well, whoopee shit!
Congrats, you managed to buy a three-year-old video card for less than I did when it was brand new!
You must have amazing shopping skills.
Yeah, that review was written at the beginning of 2006. Congrats on getting it for thirty-six dollars... I just hope you aren't planning on playing any new games on it.
Hey there!!! I would just like to know what type of power supply i should use for this card!!!??? Can you maybe tell me what the card works with??? 300w 400w 450w??? Or maybe bigger!!!! Please if you could maybe help me!!!
I've been running the x1300XGE on a 640W Antec Truepower Trio since 2007 with no troubles, and even managed to O.C. it a tiny bit, but I don't think it's worth doing. Visiontek pretty much maxed the little thing out. I've had no complaints about the card, but wouldn't recommend buying one now, unless you are a fan of old games.