
Does a Computer Need a GPU? Yes…And Here’s Why!


GPU Graphics Card

This article is part of a beginner's series on understanding computers.

Using a computer is an intensely visual experience. From the moment you turn it on you see an infinite array of colors and shapes all changing and rearranging in the blink of an eye. But how does your computer create these images?

How do 1s and 0s speeding through silicon and copper result in photos of your family vacation to Florida, spreadsheets for next week’s meeting, or the dragon you’re trying to slay? This is all thanks to the imaging workhorse of your computer, the Graphics Processing Unit or GPU.

Does a computer need a GPU? A computer requires some form of GPU to function. This can be integrated graphics built into the CPU or a separate, dedicated GPU. Dedicated GPUs provide higher performance but generally cost more.

Note: This article focuses on the principles behind graphics cards and how they process, interpret, and display graphical data. If you are trying to decide whether or not you need a dedicated graphics card for your computer, be sure to read Do You Really Need a Dedicated Graphics Card? (Use Cases).

Why We Use GPUs

Many early computers were enormous machines that took up entire rooms. In order to interact with them, we used paper punch cards with specific instructions and the computer printed the results. As computational complexity increased, so too did our need to interact quickly and efficiently with our machines.

Eventually, we switched from printouts to monitors so that we could see instant visual results of our calculations.

However, there was a problem. Computers have a Central Processing Unit (CPU). This is the brain of the computer, and it can complete a wide variety of tasks. As visual outputs became more complex, the CPU was dedicating more and more of its resources to rendering images, leaving less time for actual data processing.

To solve this problem, the GPU was introduced. The GPU is a specialized chip that handles the vast amount of repetitive calculations necessary for rendering images and playing video. Today’s computers delegate large but relatively simple jobs to the GPU, freeing the CPU to perform more complex tasks.

How Important is the GPU?

CPU, motherboard, GPU, and SSDs

It’s important to think of the graphics processor as one of several vital components in a computer system.

Let’s consider a 3D video game. How do we get from 1s and 0s to a single frame of a dragon trying to eat you? Your CPU gathers all of the data describing the frame (the scene’s geometry, textures, and drawing instructions) and sends it to the GPU.

The GPU builds the frame in parts.

  • First, it places all the vertices of the 3D objects in the scene.
  • Next, it connects those vertices with straight lines, forming primitive shapes (usually triangles). These shapes are then mapped onto pixels and shaded with the correct colors.
  • Finally, it sends the completed frame to your monitor. This process can repeat more than 60 times per second.

Obviously, this is an incredible amount of computation. The CPU is already handling all of the other functions of your computer. Giving it the extra graphical load would quickly overwhelm your system. And remember — the GPU’s function is not just limited to games. All of the images on your monitor are processed by a GPU.
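To make those steps a little more concrete, here is a toy software rasterizer written in Python. It is only a sketch: it walks through the same stages in miniature, placing the vertices of one triangle, testing which pixels fall inside it, shading them, and printing the finished frame. The coordinates and the tiny 40 x 20 "screen" are made up for illustration; a real GPU does this work in dedicated hardware, for millions of pixels, dozens of times per second.

    # A toy "GPU" walking through the same steps: place vertices, form a
    # primitive, fill the pixels it covers, then send the frame to the screen.
    WIDTH, HEIGHT = 40, 20  # a tiny made-up screen, 40 x 20 pixels

    # Step 1: the vertices of a single triangle, already in screen coordinates.
    v0, v1, v2 = (5, 3), (35, 6), (18, 17)

    def edge(a, b, p):
        """Signed-area test: which side of the edge a->b the point p lies on."""
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

    # Steps 2 and 3: rasterize - check every pixel against the triangle's edges
    # and shade the ones that fall inside it.
    frame = []
    for y in range(HEIGHT):
        row = ""
        for x in range(WIDTH):
            p = (x + 0.5, y + 0.5)  # sample at the pixel centre
            inside = (edge(v0, v1, p) >= 0 and
                      edge(v1, v2, p) >= 0 and
                      edge(v2, v0, p) >= 0)
            row += "#" if inside else "."  # "#" marks a shaded pixel
        frame.append(row)

    # Step 4: "send" the completed frame to the display.
    print("\n".join(frame))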

GPU Jargon

Not every GPU is the same. In order to compare GPUs you need to know some basic vocabulary.

  1. Core – one unit in the GPU that can receive instructions and perform calculations. More cores mean more calculations can run at the same time, so tasks finish faster. A modern GPU has hundreds or thousands of cores.
  2. Base/Boost Clock Speed – how many cycles per second the GPU’s cores run, measured in megahertz (MHz). The boost clock is a higher speed the GPU temporarily switches to during graphically intensive tasks, like playing video games.
  3. GPU RAM or VRAM – Video Random Access Memory is memory dedicated to the GPU. The number of 2D and 3D shapes your computer can handle, along with texture quality and load times, is directly related to how much VRAM you have. If you run out of VRAM, the GPU overflows into your computer’s system RAM, which is not set up for GPU use and causes a significant drop in performance.
  4. Memory Bandwidth – the rate at which the GPU can read from and write to its VRAM, measured in gigabytes per second (GB/s).

There are many other aspects of the GPU we could consider, but unless you’re building a supercomputer they aren’t especially relevant. These four are the most important, and when weighing them it pays to take a holistic view, because the specs work together.

For example, if your GPU has 16 GB of VRAM but its memory bandwidth is low, it can’t read and write that memory fast enough to use it efficiently, leading to slow performance.
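As a quick back-of-the-envelope illustration (the numbers are hypothetical), the short Python snippet below shows why bandwidth matters: even reading all 16 GB of VRAM a single time blows well past a 60-frames-per-second frame budget when the memory bus is slow.

    # Hypothetical numbers: how long does one full pass over 16 GB of VRAM take
    # at a slow versus a fast memory bandwidth?
    VRAM_GB = 16
    FRAME_BUDGET_MS = 1000 / 60  # about 16.7 ms per frame at 60 fps

    for bandwidth_gb_per_s in (50, 500):
        full_read_ms = VRAM_GB / bandwidth_gb_per_s * 1000
        print(f"{bandwidth_gb_per_s} GB/s: one pass over VRAM takes "
              f"{full_read_ms:.0f} ms (frame budget is {FRAME_BUDGET_MS:.1f} ms)")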

Under the Hood of a GPU

To render a single image, the GPU needs to perform the same calculation over and over again, once for every pixel. To do this, the GPU is split into hundreds or thousands of cores, giving it the computational resources to quickly run the math and fill every pixel of your screen with the correct color.

To utilize the power of all these cores, they must be able to work on tasks in parallel (at the same time). To facilitate this, a GPU has its own programming architecture.
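The reason all those cores can be kept busy is that most graphics work is "embarrassingly parallel": each pixel's color can be computed independently of every other pixel. The rough Python sketch below imitates the idea by spreading a made-up per-pixel shading function across a pool of CPU worker processes; a real GPU does the same thing with thousands of hardware cores instead of a handful of processes.

    # Every pixel is an independent task, so the work can be split across
    # whatever workers are available (CPU processes here, GPU cores in hardware).
    from concurrent.futures import ProcessPoolExecutor

    WIDTH, HEIGHT = 640, 360

    def shade(i):
        # Stand-in for a real shading calculation: derive a color from position.
        x, y = i % WIDTH, i // WIDTH
        return (x * 255 // WIDTH, y * 255 // HEIGHT, 128)  # an (R, G, B) value

    if __name__ == "__main__":
        with ProcessPoolExecutor() as pool:
            frame = list(pool.map(shade, range(WIDTH * HEIGHT), chunksize=WIDTH))
        print(f"Shaded {len(frame)} pixels in parallel")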

It used to be that GPUs were so specialized that they could only be used to create images. But the two main GPU producers, AMD and NVIDIA, now provide programming platforms (such as NVIDIA's CUDA and the open OpenCL standard) that let general-purpose code written in high-level languages like C++ run on the GPU. This allows the many cores of a GPU to be used for tasks beyond graphics.
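As a rough illustration of what that looks like from the software side, the sketch below uses the third-party CuPy library, a NumPy-like Python interface to NVIDIA's CUDA platform, to run an ordinary matrix multiplication on the GPU. It assumes CuPy and a CUDA-capable NVIDIA card are installed; the point is simply that the same many-core hardware that draws frames can also crunch general-purpose numbers.

    # Requires the CuPy library and an NVIDIA GPU with CUDA drivers installed.
    import numpy as np
    import cupy as cp  # NumPy-like API that executes on the GPU

    size = 4096

    # CPU: multiply two large matrices with NumPy.
    a_cpu = np.random.random((size, size)).astype(np.float32)
    b_cpu = np.random.random((size, size)).astype(np.float32)
    c_cpu = a_cpu @ b_cpu

    # GPU: the same multiply, spread across thousands of GPU cores.
    a_gpu = cp.asarray(a_cpu)   # copy the inputs into VRAM
    b_gpu = cp.asarray(b_cpu)
    c_gpu = a_gpu @ b_gpu       # computed on the GPU
    result = cp.asnumpy(c_gpu)  # copy the result back to system RAM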

This programming architecture mediates the interaction between the GPU's cores and its VRAM. While the cores are running, the GPU stores ongoing calculations and completed images in its VRAM.
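To give a sense of scale, here is a small, purely illustrative calculation of how much VRAM a single finished 1080p frame occupies, and how much data per second the framebuffer alone produces at 60 frames per second.

    # Illustrative estimate: one finished frame sitting in VRAM at 1920 x 1080,
    # with 4 bytes per pixel (red, green, blue, alpha).
    width, height, bytes_per_pixel = 1920, 1080, 4

    frame_bytes = width * height * bytes_per_pixel
    print(f"One 1080p frame: {frame_bytes / 1_000_000:.1f} MB")        # ~8.3 MB

    # At 60 frames per second, the framebuffer alone is roughly half a gigabyte
    # of writes every second, before textures, geometry, or other buffers.
    print(f"Per second at 60 fps: {frame_bytes * 60 / 1_000_000_000:.2f} GB")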

On older analog connections like VGA, the VRAM is connected to a Digital-to-Analog Converter (DAC). When an image is completed, the DAC takes the digital data stored in VRAM and translates it into an analog signal that travels over the cable to the monitor. Modern digital connections such as HDMI and DisplayPort skip this conversion and send the frame data in digital form.

Applications of GPUs

These days when people hear GPU they think of video games. While games are some of the most graphically intense computer applications, GPUs offer a wide range of benefits in other fields.

A good GPU is necessary for most engineers. Modern engineering relies on CAD or Computer-Aided Design software. CAD is used to create simulations to see if submarines will hold up under deep-sea pressures or if a new type of rover will make it to Mars.

These simulations have many variables to account for and thus require lots of computational power. A good GPU allows engineers all over the world to quickly and accurately model whatever project they’re working on.

Photo and video editing also require intense graphical calculations. Anyone who uses Photoshop or a video editor benefits from a capable GPU, which accelerates filters, previews, and exports.

The fastest supercomputer in the world, Summit at the Oak Ridge National Laboratory, uses over 27,000 GPUs to simulate and solve some of the hardest physics problems of our generation (Source).

Even internet browsers are beginning to use GPUs. Modern browsers like Chrome and Firefox use your GPU to render pages and decode video more smoothly.

Conclusion

GPUs are critically important to how we use computers. Without a GPU you would have no images on your monitor and no way to interact with your computer. Whether you need a dedicated graphics card or can rely on the integrated graphics of your CPU, however, depends entirely on your needs and the demands you’ll be putting on it.

Unsure of whether you need a dedicated graphics card or if you can get by with integrated graphics? Read the next article in this series: Do You Really Need a Dedicated Graphics Card? (Use Cases).
