We build custom water-cooled PCs for you. We have now built more than 20,000 PCs for our customers, including the FBI and well-known YouTubers. Here you can see how people react when they receive one of our computers. They aren't free, of course.
Just wow. The quality is so good. I only paid $5,000.
– Donald Trump
These guys build some amazing PCs.
No one I have ever met has built a PC better than the one I own now.
They built PCs as good as I built walls!
I have my own gaming channel, so a good gaming PC is very important to me.
I am super happy with the computer these guys built for me!
Once all fields are filled in, the value is calculated
Enter amount
Amount including VAT
Amount excluding VAT
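The two amount fields convert between the same price with and without VAT. A minimal sketch of that conversion, assuming the standard Dutch VAT (BTW) rate of 21% (the page itself does not state the rate):

```python
# VAT (BTW) conversion behind the calculator fields above.
# Assumes the standard Dutch rate of 21%; adjust VAT_RATE if needed.
VAT_RATE = 0.21

def incl_to_excl(amount_incl: float) -> float:
    """Strip VAT from a gross amount (including VAT -> excluding VAT)."""
    return round(amount_incl / (1 + VAT_RATE), 2)

def excl_to_incl(amount_excl: float) -> float:
    """Add VAT to a net amount (excluding VAT -> including VAT)."""
    return round(amount_excl * (1 + VAT_RATE), 2)

print(excl_to_incl(100.00))  # 121.0
print(incl_to_excl(121.00))  # 100.0
```

Note that going from a gross amount back to net divides by 1.21 rather than subtracting 21%, which is a common mistake in these calculations.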
Once all fields are filled in, the value is calculated
Enter interest rate (%)
Interest payable
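A minimal sketch of the interest field's calculation, assuming simple (non-compound) interest over a single period, since the form does not specify a compounding scheme:

```python
def interest_due(amount: float, rate_percent: float) -> float:
    """Simple interest for one period: amount times rate/100."""
    return round(amount * rate_percent / 100, 2)

print(interest_due(1000.00, 5.0))  # 50.0
```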
Information about CPUs
The Core i processor series is Intel's brand name for its best-performing consumer processors. This applies both to processors in desktop computers and to mobile processors in laptops. The Core i series consists of several sub-brands: the i3, the i5 and the i7, plus, on the high-end desktop platform, the Core i9 with the very best performance.
For basic computer use, a Core i3 processor on the mainstream platform is sufficient. Do you use your computer a lot for gaming, photo or video editing? Then a Core i5 or Core i7 processor is a sensible choice thanks to the extra computing power.
If you do this kind of work daily, a fast but expensive Core i9 processor is certainly worth considering. Do you only use the computer for browsing, word processing, and watching or listening to music and videos? Then a processor from the slower Pentium or Celeron series is more than enough for your needs.
The processor is often called the heart of the computer: this main chip is responsible for all the calculations and control tasks that take place inside it.
The faster a processor is, the more calculations and control tasks it can process per second, and the more tasks it handles per second, the faster your computer is.
The Core i3 is the least powerful processor in the Core i series. You can safely choose a Core i3 if you are not a demanding user.
Its performance is perfectly adequate, and it can be seen as the standard model for people who use their computer normally and do not expect above-average performance.
You can browse the internet and send e-mail without any problems. Tasks for work or study, for example in Microsoft Office, are no problem either. Watching series and films is also well within a Core i3's capabilities.
Thanks to its integrated graphics chip, the Core i3 can also run basic games, mainly older titles at a low resolution. Simple photo editing is also perfectly doable on a Core i3.
A Core i3 processor is suitable for:
Internet & e-mail
Work & study
Watching films & series
A Core i3 processor is less suitable for:
Gaming (HD, 4K & VR)
The Core i5 processor sits between the Core i3 and the Core i7 in performance. The Core i5 is very suitable for gaming when combined with a separate, dedicated graphics card.
The Core i5 delivers excellent performance for a good family computer that is occasionally used for more than just office and school tasks.
If your budget allows it, a Core i5 is in many cases the best choice. You can then run any task on your computer without having to worry that something won't work.
Are you certain you will only browse the internet, e-mail and do office tasks? Then this more expensive processor is a waste of money; a Pentium or Core i3 processor can handle those tasks just fine.
The Core i5 processor lacks Hyper-Threading but does have Turbo Boost, which lets the processor temporarily overclock itself under heavy workloads.
My advice is to spend the money you save on an SSD as a storage device instead of a 'slow' hard drive. A Core i3 with an SSD performs much better than a Core i5 with a slow hard drive.
Read more about storage devices in the article: Choosing a storage device: HDD, SSD or SSHD?
A Core i5 processor is suitable for:
Internet & e-mail
Work & study
Watching films & series
Gaming (HD) in combination with a powerful graphics card
A Core i5 processor is less suitable for:
Gaming (4K & VR)
Do you expect top performance from your computer because you use it every day, run many programs at once, and have no patience for waiting?
Then the Core i7 is for you: the top segment of these processors, intended for fanatical computer users who demand the utmost from their machine. Naturally, you pay a higher price for it.
The Core i7 processor has both Hyper-Threading and Turbo Boost. Its cache memory is also maximal, and it usually has four or more processor cores.
The Core i7 processors are specially selected: they are the best-performing chips to come out of production. All Core i processors are manufactured in the same way, and units that do not perform up to spec are sold as a Core i5, i3, Pentium or Celeron instead.
This way Intel throws away almost no processors; chips that did not turn out well are simply sold under a different name with lower performance.
A Core i7 processor is suitable for:
Internet & e-mail
Work & study
Watching films & series
Gaming (HD, 4K & VR)
Information about GPUs
Titan X (Pascal, 2016)
RX Vega 64
1080p 144fps gaming this row and up
1080p 120fps gaming this row and up
GTX Titan X (Maxwell 2015)
GTX 980 Ti
R9 Fury X
4K 60 fps gaming this row and up
GTX 1060 6GB
RX 480 8GB
RX 480 4GB
GTX 1060 3GB
GTX 780 Ti
GTX 1050 Ti
RX 560 4GB
GTX 660 Ti
RX 560 2GB
900p 60fps gaming this row and up
PlayStation 4 GPU equivalent
Xbox One GPU equivalent
Integrated: Intel Iris Pro Graphics 6200
How are these graphics cards ranked?
These graphics card rankings are determined by performance in real-world PC gaming benchmarks. A benchmark is measured by the number of frames per second (FPS) a computer can achieve playing a certain game at a specified resolution and graphical settings.
Since most people purchase graphics cards for their gaming capability, we view real gameplay benchmarks (as opposed to synthetic benchmarking programs) as the best way to measure the performance of a graphics card. Graphics card performance can vary from game to game, but this comparison table reflects the general rankings of each card.
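The ranking method described above can be sketched in a few lines: each card is scored by its average FPS across a handful of real-game benchmarks, then the cards are sorted by that score. The card names are taken from the table above, but the FPS figures below are invented placeholders purely for illustration, not real benchmark results.

```python
# Hypothetical FPS numbers per game, just to show the ranking mechanics.
benchmarks = {
    "GTX 1060 6GB": {"Game A": 78, "Game B": 64, "Game C": 71},
    "RX 480 8GB":   {"Game A": 75, "Game B": 66, "Game C": 69},
    "GTX 1050 Ti":  {"Game A": 52, "Game B": 41, "Game C": 47},
}

def rank_cards(results: dict) -> list:
    """Rank cards by their mean FPS across all benchmarked games, highest first."""
    avg = {card: sum(fps.values()) / len(fps) for card, fps in results.items()}
    return sorted(avg, key=avg.get, reverse=True)

print(rank_cards(benchmarks))  # ['GTX 1060 6GB', 'RX 480 8GB', 'GTX 1050 Ti']
```

A plain mean is the simplest aggregation; published rankings often weight games or use relative-performance percentages instead, but the idea is the same.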
Our favourite sources for gaming benchmarks include TechPowerUp, TomsHardware, AnandTech, TechSpot, and many more.
What is it?
(Graphics Processing Unit) A programmable logic chip (processor) specialized for display functions. The GPU renders images, animations and video for the computer's screen. GPUs are located on plug-in cards, in a chipset on the motherboard or in the same chip as the CPU. See logic chip.
A GPU performs parallel operations. Although it is used for 2D data as well as for zooming and panning the screen, a GPU is essential for smooth decoding and rendering of 3D animations and video. The more sophisticated the GPU, the higher the resolution and the faster and smoother the motion in games and movies. GPUs on stand-alone cards include their own memory (RAM), while GPUs in the chipset or CPU chip share main memory with the CPU.
Not Just Graphics Processing
Since GPUs perform parallel operations on multiple sets of data, they are increasingly used as vector processors for non-graphics applications that require repetitive computations. For example, in 2010, a Chinese supercomputer achieved the record for top speed using more than seven thousand GPUs in addition to its CPUs (see GPGPU). See graphics pipeline and multi-GPU.
The evolution of the modern graphics processor begins with the introduction of the first 3D add-in cards in 1995, followed by the widespread adoption of the 32-bit operating systems and the affordable personal computer.
The graphics industry that existed before that largely consisted of a more prosaic 2D, non-PC architecture, with graphics boards better known by their chip’s alphanumeric naming conventions and their huge price tags. 3D gaming and virtualization PC graphics eventually coalesced from sources as diverse as arcade and console gaming, military, robotics and space simulators, as well as medical imaging.
The early days of 3D consumer graphics were a Wild West of competing ideas, from how to implement the hardware to the use of different rendering techniques and their application and data interfaces, as well as persistent naming hyperbole. The early graphics systems featured a fixed function pipeline (FFP) and an architecture following a very rigid processing path, utilizing almost as many graphics APIs as there were 3D chip makers.
While 3D graphics turned a fairly dull PC industry into a light and magic show, they owe their existence to generations of innovative endeavour. Over the next few weeks (this is the first installment in a series of four articles) we'll be taking an extensive look at the history of the GPU, going from the early days of 3D consumer graphics, to the 3Dfx Voodoo game-changer, the industry's consolidation at the turn of the century, and today's modern GPGPU.
1976 - 1995: The Early Days of 3D Consumer Graphics
The first true 3D graphics started with early display controllers, known as video shifters and video address generators. They acted as a pass-through between the main processor and the display. The incoming data stream was converted into serial bitmapped video output such as luminance, color, and vertical and horizontal composite sync, which kept the line of pixels in a display generation and synchronized each successive line along with the blanking interval (the time between ending one scan line and starting the next).
A flurry of designs arrived in the latter half of the 1970s, laying the foundation for 3D graphics as we know them.
RCA’s “Pixie” video chip (CDP1861) in 1976, for instance, was capable of outputting an NTSC-compatible video signal at 62x128 resolution, or 64x32 for the ill-fated RCA Studio II console.
The video chip was quickly followed a year later by the Television Interface Adapter (TIA) 1A, which was integrated into the Atari 2600 for generating the screen display, sound effects, and reading input controllers. Development of the TIA was led by Jay Miner, who also led the design of the custom chips for the Commodore Amiga computer later on.
In 1978, Motorola unveiled the MC6845 video address generator. This became the basis for the IBM PC’s Monochrome and Color Display Adapter (MDA/CDA) cards of 1981, and provided the same functionality for the Apple II. Motorola added the MC6847 video display generator later the same year, which made its way into a number of first generation personal computers, including the Tandy TRS-80.
A similar solution from Commodore’s MOS Tech subsidiary, the VIC, provided graphics output for 1980-83 vintage Commodore home computers.
In November the following year, LSI’s ANTIC (Alphanumeric Television Interface Controller) and CTIA/GTIA co-processor (Color or Graphics Television Interface Adaptor) debuted in the Atari 400. ANTIC processed 2D display instructions using direct memory access (DMA). Like most video co-processors, it could generate playfield graphics (background, title screens, scoring display), while the CTIA generated colors and moveable objects. Yamaha and Texas Instruments supplied similar ICs to a variety of early home computer vendors.
The next steps in the graphics evolution were primarily in the professional fields.
Intel used their 82720 graphics chip as the basis for the $1000 iSBX 275 Video Graphics Controller Multimode Board. It was capable of displaying eight colors at a resolution of 256x256 (or monochrome at 512x512). Its 32KB of display memory was sufficient to draw lines, arcs, circles, rectangles and character bitmaps. The chip also had provision for zooming, screen partitioning and scrolling.
SGI quickly followed up with their IRIS Graphics for workstations -- a GR1.x graphics board with provision for separate add-in (daughter) boards for color options, geometry, Z-buffer and Overlay/Underlay.
Intel's $1000 iSBX 275 Video Graphics Controller Multimode Board was capable of displaying eight colors at a resolution of 256x256 (or monochrome at 512x512).
Industrial and military 3D virtualization was relatively well developed at the time. IBM, General Electric and Martin Marietta (who were to buy GE’s aerospace division in 1992), along with a slew of military contractors, technology institutes and NASA ran various projects that required the technology for military and space simulations. The Navy also developed a flight simulator using 3D virtualization from MIT’s Whirlwind computer in 1951.
Besides defence contractors there were companies that straddled military markets with professional graphics.
Evans & Sutherland – who were to provide professional graphics card series such as the Freedom and REALimage – also provided graphics for the CT5 flight simulator, a $20 million package driven by a DEC PDP-11 mainframe. Ivan Sutherland, the company’s co-founder, developed a computer program in 1961 called Sketchpad, which allowed drawing geometric shapes and displaying on a CRT in real-time using a light pen.
This was the progenitor of the modern Graphic User Interface (GUI).
In the less esoteric field of personal computing, Chips and Technologies’ 82C43x series of EGA (Extended Graphics Adapter) chips provided much needed competition to IBM’s adapters, and could be found installed in many PC/AT clones around 1985. The year was noteworthy for the Commodore Amiga as well, which shipped with the OCS chipset. The chipset comprised three main component chips -- Agnus, Denise, and Paula -- which allowed a certain amount of graphics and audio calculation to be non-CPU dependent.
In August of 1985, three Hong Kong immigrants, Kwok Yuan Ho, Lee Lau and Benny Lau, formed Array Technology Inc in Canada. By the end of the year, the name had changed to ATI Technologies Inc.
ATI got their first product out the following year, the OEM Color Emulation Card. It was used for outputting monochrome green, amber or white phosphor text against a black background to a TTL monitor via a 9-pin DE-9 connector. The card came equipped with a minimum of 16KB of memory and was responsible for a large percentage of ATI’s CAD$10 million in sales in the company’s first year of operation. This was largely done through a contract that supplied around 7000 chips a week to Commodore Computers.
ATI's Color Emulation Card came with a minimum 16KB of memory and was responsible for a large part of the company’s CAD$10 million in sales the first year of operation.
The advent of color monitors and the lack of a standard among the array of competitors ultimately led to the formation of the Video Electronics Standards Association (VESA), of which ATI was a founding member, along with NEC and six other graphics adapter manufacturers.
In 1987 ATI added the Graphics Solution Plus series to its product line for OEMs, which used IBM’s PC/XT ISA 8-bit bus for Intel 8086/8088-based IBM PCs. The chip supported MDA, CGA and EGA graphics modes via DIP switches. It was basically a clone of the Plantronics Colorplus board, but with room for 64KB of memory. Paradise Systems’ PEGA1, 1a, and 2a (256KB), released in 1987, were Plantronics clones as well.
The EGA Wonder series 1 to 4 arrived in March for $399, featuring 256KB of DRAM as well as compatibility with CGA, EGA and MDA emulation with up to 640x350 and 16 colors. Extended EGA was available for the series 2,3 and 4.
Filling out the high end was the EGA Wonder 800 with 16-color VGA emulation and 800x600 resolution support, and the VGA Improved Performance (VIP) card, which was basically an EGA Wonder with a digital-to-analog converter (DAC) added to provide limited VGA compatibility. The latter cost $449 plus $99 for the Compaq expansion module.
ATI was far from being alone riding the wave of consumer appetite for personal computing.
Many new companies and products arrived that year. Among them were Trident, SiS, Tamerack, Realtek, Oak Technology, LSI’s G-2 Inc., Hualon, Cornerstone Imaging and Winbond -- all formed in 1986-87. Meanwhile, companies such as AMD, Western Digital/Paradise Systems, Intergraph, Cirrus Logic, Texas Instruments, Gemini and Genoa would produce their first graphics products during this timeframe.
ATI’s Wonder series continued to gain prodigious updates over the next few years.
In 1988, the Small Wonder Graphics Solution with game controller port and composite out options became available (for CGA and MDA emulation), as well as the EGA Wonder 480 and 800+ with Extended EGA and 16-bit VGA support, and also the VGA Wonder and Wonder 16 with added VGA and SVGA support.
A Wonder 16 equipped with 256KB of memory retailed for $499, while a 512KB variant cost $699.
An updated VGA Wonder/Wonder 16 series arrived in 1989, including the reduced cost VGA Edge 16 (Wonder 1024 series). New features included a bus-Mouse port and support for the VESA Feature Connector. This was a gold-fingered connector similar to a shortened data bus slot connector, and it linked via a ribbon cable to another video controller to bypass a congested data bus.
The Wonder series updates continued to move apace in 1991. The Wonder XL card added VESA 32K color compatibility and a Sierra RAMDAC, which boosted maximum display resolution to 640x480 @ 72Hz or 800x600 @ 60Hz. Prices ranged from $249 (256KB) and $349 (512KB) to $399 for the 1MB RAM option. A reduced cost version called the VGA Charger, based on the previous year’s Basic-16, was also made available.
ATI added a variation of the Wonder XL that incorporated a Creative Sound Blaster 1.5 chip on an extended PCB. Known as the VGA Stereo-F/X, it was capable of simulating stereo from Sound Blaster mono files at something approximating FM radio quality.
ATI Graphics Ultra ISA (Mach8 + VGA)
The Mach series launched with the Mach8 in May of that year. It sold as either a chip or a board that allowed, via a programming interface (API), the offloading of limited 2D drawing operations such as line draw, color fill and bitmap combination (Bit BLIT).
Graphics boards such as the ATI VGAWonder GT, offered a 2D + 3D option, combining the Mach8 with the graphics core (28800-2) of the VGA Wonder+ for its 3D duties. The Wonder and Mach8 pushed ATI through the CAD$100 million sales milestone for the year, largely on the back of Windows 3.0’s adoption and the increased 2D workloads that could be employed with it.
S3 Graphics was formed in early 1989 and produced its first 2D accelerator chip and a graphics card eighteen months later, the S3 911 (or 86C911). Key specs for the latter included 1MB of VRAM and 16-bit color support.
The S3 911 was superseded by the 924 that same year -- it was basically a revised 911 with 24-bit color -- and again updated the following year with the 928 which added 32-bit color, and the 801 and 805 accelerators. The 801 used an ISA interface, while the 805 used VLB. Between the 911’s introduction and the advent of the 3D accelerator, the market was flooded with 2D GUI designs based on S3’s original -- notably from Tseng labs, Cirrus Logic, Trident, IIT, ATI’s Mach32 and Matrox’s MAGIC RGB.
In January 1992, Silicon Graphics Inc (SGI) released OpenGL 1.0, a multi-platform vendor agnostic application programming interface (API) for both 2D and 3D graphics.
OpenGL evolved from SGI’s proprietary API, called the IRIS GL (Integrated Raster Imaging System Graphical Library). It was an initiative to keep non-graphical functionality from IRIS, and allow the API to run on non-SGI systems, as rival vendors were starting to loom on the horizon with their own proprietary APIs.
Microsoft was developing a rival API of their own called Direct3D and didn’t exactly break a sweat making sure OpenGL ran as well as it could under Windows.
Initially, OpenGL was aimed at the professional UNIX based markets, but with developer-friendly support for extension implementation it was quickly adopted for 3D gaming.
Things came to a head a few years later when John Carmack of id Software, whose previously released Doom had revolutionised PC gaming, ported Quake to use OpenGL on Windows and openly criticised Direct3D.
Microsoft’s intransigence increased as they denied licensing of OpenGL’s Mini-Client Driver (MCD) on Windows 95, which would allow vendors to choose which features would have access to hardware acceleration. SGI replied by developing the Installable Client Driver (ICD), which not only provided the same ability, but did so even better since MCD covered rasterisation only and ICD added lighting and transform functionality (T&L).
During the rise of OpenGL, which initially gained traction in the workstation arena, Microsoft was busy eyeing the emerging gaming market with designs on their own proprietary API. They acquired RenderMorphics in February 1995, whose Reality Lab API was gaining traction with developers and became the core for Direct3D.
At about the same time, 3dfx’s Brian Hook was writing the Glide API that was to become the dominant API for gaming. This was in part due to Microsoft’s involvement with the Talisman project (a tile based rendering ecosystem), which diluted the resources intended for DirectX.
As D3D became widely available on the back of Windows adoption, proprietary APIs such as S3d (S3), Matrox Simple Interface, Creative Graphics Library, C Interface (ATI), SGL (PowerVR), NVLIB (Nvidia), RRedline (Rendition) and Glide, began to lose favor with developers.
It didn’t help matters that some of these proprietary APIs were allied with board manufacturers under increasing pressure to add to a rapidly expanding feature list. This included higher screen resolutions, increased color depth (from 16-bit to 24 and then 32), and image quality enhancements such as anti-aliasing. All of these features called for increased bandwidth, graphics efficiency and faster product cycles.
By 1993, market volatility had already forced a number of graphics companies to withdraw from the business, or to be absorbed by competitors.
The year 1993 ushered in a flurry of new graphics competitors, most notably Nvidia, founded in January of that year by Jen-Hsun Huang, Curtis Priem and Chris Malachowsky. Huang was previously the Director of Coreware at LSI while Priem and Malachowsky both came from Sun Microsystems where they had previously developed the SunSPARC-based GX graphics architecture.
Fellow newcomers Dynamic Pictures, ARK Logic, and Rendition joined Nvidia shortly thereafter.
Among the companies that had withdrawn or been absorbed were Tamerack, Gemini Technology, Genoa Systems, Hualon, Headland Technology (bought by SPEA), Acer, Motorola and Acumos (bought by Cirrus Logic).
One company that was moving from strength to strength, however, was ATI. As a forerunner of the All-In-Wonder series, late November saw the announcement of ATI’s 68890 PC TV decoder chip, which debuted inside the Video-It! card. The chip was able to capture video at 320x240 @ 15 fps, or 160x120 @ 30 fps, as well as compress/decompress it in real time thanks to the onboard Intel i750PD VCP (Video Compression Processor). It was also able to communicate with the graphics board via the data bus, negating the need for dongles or ports and ribbon cables. The Video-It! retailed for $399, while a lesser-featured model named Video-Basic completed the line-up.
ATI belatedly introduced a 64-bit accelerator, the Mach64. The financial year had not been kind to ATI, with a CAD$2.7 million loss as it slipped in the marketplace amid strong competition. Rival boards included the S3 Vision 968, which was picked up by many board vendors, and the Trio64, which won OEM contracts from Dell (Dimension XPS), Compaq (Presario 7170/7180), AT&T (Globalyst), HP (Vectra VE 4), and DEC (Venturis/Celebris).
Released in 1995, the Mach64 notched a number of notable firsts. It became the first graphics adapter to be available for PC and Mac computers in the form of the Xclaim ($450 and $650 depending on onboard memory), and, along with S3's Trio, offered full-motion video playback acceleration.
The Mach64 also ushered in ATI’s first pro graphics cards, the 3D Pro Turbo and 3D Pro Turbo+PC2TV, priced at a cool $599 for the 2MB option and $899 for the 4MB.
The following month saw a technology start-up called 3DLabs rise onto the scene, born when DuPont’s Pixel graphics division bought the subsidiary from its parent company, along with the GLINT 300SX processor capable of OpenGL rendering, fragment processing and rasterisation. Due to their high price the company's cards were initially aimed at the professional market. The Fujitsu Sapphire2SX 4MB retailed for $1600-$2000, while an 8MB ELSA GLoria 8 was $2600-$2850. The 300SX, however, was intended for the gaming market.
S3 seemed to be everywhere at that time. The high-end OEM market was dominated by the company's Trio64 chipsets, which integrated a DAC, a graphics controller, and a clock synthesiser into a single chip.
The Gaming GLINT 300SX of 1995 featured a much-reduced 2MB of memory. It used 1MB for textures and Z-buffer and the other for frame buffer, but came with an option to increase the VRAM for Direct3D compatibility for another $50 over the $349 base price. The card failed to make headway in an already crowded marketplace, but 3DLabs was already working on a successor in the Permedia series.
The Trio64 chipsets also utilized a unified frame buffer and supported hardware video overlay (a dedicated portion of graphics memory for rendering video as the application requires). The Trio64 and its 32-bit memory bus sibling, the Trio32, were available as OEM units and standalone cards from vendors such as Diamond, ELSA, Sparkle, STB, Orchid, Hercules and Number Nine. Diamond Multimedia’s prices ranged from $169 for a ViRGE based card, to $569 for a Trio64+ based Diamond Stealth64 Video with 4MB of VRAM.
The mainstream end of the market also included offerings from Trident, a long time OEM supplier of no-frills 2D graphics adapters who had recently added the 9680 chip to its line-up. The chip boasted most of the features of the Trio64 and the boards were generally priced around the $170-200 mark. They offered acceptable 3D performance in that bracket, with good video playback capability.
Other newcomers in the mainstream market included Weitek’s Power Player 9130, and Alliance Semiconductor’s ProMotion 6410 (usually seen as the Alaris Matinee or FIS’s OptiViewPro). Both offered excellent scaling with CPU speed, while the latter combined the strong scaling engine with antiblocking circuitry to obtain smooth video playback, which was much better than in previous chips such as the ATI Mach64, Matrox MGA 2064W and S3 Vision968.
Nvidia launched their first graphics chip, the NV1, in May; it was the first commercial graphics processor capable of 3D rendering, video acceleration, and integrated GUI acceleration.
They partnered with ST Microelectronics to produce the chip on its 500nm process, and the latter also promoted the STG2000 version of the chip. Although it was not a huge success, it did represent the first financial return for the company. Unfortunately for Nvidia, just as the first vendor boards started shipping (notably the Diamond Edge 3D) in September, Microsoft finalized and released DirectX 1.0.
The D3D graphics API relied on rendering triangular polygons, whereas the NV1 used quad texture mapping. Limited D3D compatibility was added via a driver that wrapped triangles as quadratic surfaces, but a lack of games tailored for the NV1 doomed the card as a jack of all trades, master of none.
Most of the games were ported from the Sega Saturn. A 4MB NV1 with integrated Saturn ports (two per expansion bracket connected to the card via ribbon cable), retailed for around $450 in September 1995.
Microsoft’s late changes and launch of the DirectX SDK left board manufacturers unable to directly access hardware for digital video playback. This meant that virtually all discrete graphics cards had functionality issues in Windows 95. Drivers under Win 3.1 from a variety of companies were generally faultless by contrast.
ATI announced their first 3D accelerator chip, the 3D Rage (also known as the Mach 64 GT), in November 1995.
The first public demonstration of it came at the E3 video game conference held in Los Angeles in May the following year. The card itself became available a month later. The 3D Rage merged the 2D core of the Mach64 with 3D capability.
Late revisions to the DirectX specification meant that the 3D Rage had compatibility problems with many games that used the API -- mainly due to the lack of depth buffering. With an on-board 2MB EDO RAM frame buffer, 3D modality was limited to 640x480x16-bit or 400x300x32-bit. Attempting 32-bit color at 640x480 generally resulted in on-screen color corruption, and 2D resolution peaked at 1280x1024. While gaming performance was mediocre, the full-screen MPEG playback ability at least went some way toward balancing the feature set.
ATI reworked the chip, and in September the Rage II launched. It rectified the D3D issues of the first chip and added MPEG2 playback support. Initial cards, however, still shipped with 2MB of memory, hampering performance and causing issues with perspective/geometry transforms. As the series was expanded to include the Rage II+DVD and 3D Xpression+, memory capacity options grew to 8MB.
While ATI was first to market with a 3D graphics solution, it didn’t take long for other competitors with differing ideas of 3D implementation to arrive on the scene: namely 3dfx, Rendition, and VideoLogic.
Screamer 2, released in 1996, running on Windows 95 with 3dfx Voodoo 1 graphics
In the race to release new products into the marketplace, 3Dfx Interactive won over Rendition and VideoLogic. The performance race, however, was over before it had started, with the 3Dfx Voodoo Graphics effectively annihilating all competition.
This article is the first installment in a series of four. If you enjoyed it, make sure to join us next week as we take a stroll down memory lane to the heyday of 3Dfx, Rendition, Matrox and a young company called Nvidia.
AMD Radeon HD 7480D Graphics Card
ASRock FM2A68M-DG3+ Motherboard
4GB DDR3 RAM Memory
Intel® Core™ i5-7400 3.0GHz Processor
MSI GeForce GTX 1060 GAMING 6G Graphics Card
MSI B250M PRO-VD Motherboard
Crucial 8GB DDR4 RAM Memory
Intel Core i7-8700K Processor
Nvidia GeForce GTX 1080 Ti Graphics Card
Asus ROG Strix Z370-E Motherboard
G.Skill Flare X 16GB DDR4-3200 RAM Memory
Would you like a custom water-cooled build? Contact us.