How we Test Gaming PC & Laptop Hardware

At this site, we put various technological advancements to the test, ensuring that our evaluations are dependable, repeatable, equitable, and, most importantly, beneficial to our readers. As avid PC gamers ourselves, we care about more than a component’s raw speed or flashy RGB lighting; we’re concerned with how it performs in-game and how it will serve you. Value also plays a vital role in our testing process, whether we’re looking at the latest graphics card, a shiny new streaming microphone, or a blazing-fast gaming monitor. Technology must make financial sense at the point of sale and on your computer screen.

However, this doesn’t imply that we’re only interested in budget-friendly options. Even expensive PC gaming technology can provide excellent value if it delivers something unique. When you’re paying today’s high prices for PC hardware, you need to be confident that you’re making the right choice.

Our TheGamingLaptop hardware team has over 3 years of combined experience in hardware journalism, ensuring that we cover everything that’s crucial to you and to our hobby as a whole. We are committed to providing comprehensive coverage that caters to the needs of our readers.

What does the TheGamingLaptop team review?

Due to the sheer number of hardware releases every month, we can’t test everything that hits the market. Therefore, we have to be selective about the technology we bring in for review.

We always make sure to test the latest graphics cards, covering the most significant releases on day one. However, we may not review every single processor from AMD and Intel. Instead, we focus on the flagship products and the CPUs that are most relevant to gamers. We keep an eye on post-launch buzz and affordability down the line, too. If a chip suddenly becomes more affordable and gains more interest, we’ll assess whether it’s worth your money.

Similarly, we evaluate new gaming laptops with the latest mobile technology inside them, but we can’t review every SKU on the shelves. We look for something new and interesting that sets them apart, whether it’s a new GPU, mobile chip architecture, or screen configuration.

The same goes for SSDs, gaming monitors, and peripherals. We don’t find it interesting to look at yet another Cherry MX Red mechanical keyboard unless it offers something new, like a lower price point or a new feature or technology.

The rapid pace of PC innovation means that prices and the competitive landscape are always changing. Occasionally, we update our tech reviews, which may mean changing the score. Our goal is to provide you with the most relevant and up-to-date hardware buying advice, no matter when you read a review.

We’re not suggesting updating a five-year-old monitor review every time Amazon reduces the sticker price by three percent. However, when a graphics card, SSD, or processor remains relevant or suddenly becomes more so, we may revise our reviews.

These are the most critical, price-sensitive, and performance-driven products in the PC ecosystem. They are also the products that we may recommend to you for years after their initial launch.

How do we Score Hardware?

Our approach to evaluating hardware at TheGamingLaptop utilizes a 100-point scoring system expressed as a percentage. We want to make sure our readers have a clear understanding of what each score represents, so here is a breakdown of how we use this system. It’s important to note that scores are not the review itself, but rather a convenient way to summarize the reviewer’s opinion.

We divide the score into three categories: Low, Medium, and High.

Low: 0 to 49%

Scores of 0-9% represent hardware that is utterly broken or offensively bad, offering no value whatsoever. Moving up the scale, a score of 10-19% indicates that we may be able to find one positive aspect, but the product still isn’t worth anyone’s time or money. A score of 20-29% suggests that the hardware completely falls short of its intended goals and offers very few redeeming qualities. A score of 30-39% indicates that the product is clumsy or derivative in its efforts, while a score of 40-49% suggests that the hardware is flawed and disappointing.

Medium: 50 to 79%

When we award a score of 50-59%, it means that the product is mediocre at best, and other hardware likely does a better job, or the hardware’s unique qualities are not executed particularly well. A score of 60-69% suggests that there are some things to like about the hardware, but major caveats prevent us from giving it an unreserved recommendation. When we give a score of 70-79%, it means that the product is well-made and definitely worth a closer look.

High: 80 to 100%

An 80-89% score indicates that the hardware is a great piece of technology that deserves a place in your PC setup, while a score of 90-94% suggests that it is an outstanding product worthy of any gamer’s rig. At the top end of the scale, a score of 95-98% indicates that the hardware is absolutely brilliant, delivering unprecedented performance or featuring innovations that we have never seen before. Finally, a score of 99-100% represents hardware that is so groundbreaking and revolutionary that it advances the human species.
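In practical terms, the banding above is a simple threshold lookup. Here is a minimal sketch in Python (purely illustrative, not part of our actual editorial tooling):

```python
def score_band(score: int) -> str:
    """Map a 0-100 review score to its band, per the breakdown above."""
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score <= 49:
        return "Low"
    if score <= 79:
        return "Medium"
    return "High"

# Example: a 75% product is well-made and worth a closer look
print(score_band(75))  # Medium
```

The key point the code makes explicit: the band boundaries are inclusive at the top of each range, so 49% is still Low and 79% is still Medium.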

In addition to the score, our staff may award an Editor’s Choice Award at their discretion, which represents exceptional quality or innovation.

We hope that this breakdown of our scoring system has provided clarity and insight into how we evaluate hardware.

How do we test Graphics Cards?

At our testing facility, we maintain a fixed test bench, which we update only when a new generation of gaming hardware emerges. This ensures that every graphics card is measured against the same standardized benchmark numbers, so our comparative performance figures remain meaningful.

To achieve this, we employ a combination of both synthetic benchmarks and various games, all played at different resolutions, to evaluate how the graphics cards perform in the context of the latest titles available. This selection of games and benchmarks is updated in conjunction with the hardware inside the testing rig.

While raw frame rates and index scores are essential, we recognize that they alone do not paint a complete picture of a GPU’s capabilities. Therefore, we utilize the Nvidia PCAT hardware and FrameView software to examine factors such as temperature, power draw, and performance per Watt (at both 4K and 1080p) to provide a more comprehensive assessment of a graphics card’s performance.
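Performance per watt boils down to dividing average frame rate by average board power over a benchmark run. A hypothetical sketch of that calculation (this is not the actual PCAT/FrameView export format, just an illustration of the metric):

```python
def perf_per_watt(frame_times_ms, power_samples_w):
    """Average FPS divided by average board power (watts).

    frame_times_ms: per-frame render times in milliseconds (FrameView-style log)
    power_samples_w: periodic power-draw samples in watts (PCAT-style log)
    """
    avg_frame_time = sum(frame_times_ms) / len(frame_times_ms)
    avg_fps = 1000.0 / avg_frame_time  # 1000 ms per second
    avg_power = sum(power_samples_w) / len(power_samples_w)
    return avg_fps / avg_power

# e.g. a steady 10 ms frame time (100 fps) at 250 W average draw
print(round(perf_per_watt([10.0] * 5, [250.0] * 5), 2))  # 0.4
```

A card that delivers the same frame rate at lower power scores higher on this metric, which is why we log it at both 4K and 1080p.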

Furthermore, we strive to explore the potential for overclocking the GPUs themselves in our daily testing routine. However, we do not aim to set world records in terms of GPU frequency that lasts only a split second before crashing the system. Instead, we aim to determine whether there is a feasible overclock that can genuinely enhance the graphics card’s performance.

How do we test CPUs?

At our facility, we have dedicated systems for testing CPUs from both AMD and Intel. The purpose of these systems is to ensure that we can provide accurate and comparative performance metrics for these processors against their competition. To achieve this, we use the same GPU across both platforms when we’re comparing their numbers.

Our test rigs are equipped with the latest components and software to deliver reliable and consistent performance metrics.

The 12th Gen test rig:
- Asus ROG Maximus Z690 Hero motherboard
- Corsair Dominator RAM clocked at 5,200MHz
- Nvidia GeForce RTX 3080 graphics card
- 1TB WD Black SN850 PCIe 4.0 solid-state drive
- Asus ROG Ryujin II 360 cooler
- NZXT 850W power supply
- DimasTech Mini V2 case
- Windows 11

The 11th Gen test rig:
- MSI MPG Z490 Carbon WiFi motherboard
- Corsair Vengeance Pro RGB RAM clocked at 3,600MHz
- Nvidia GeForce RTX 3080 graphics card
- 1TB WD Black SN850 PCIe 4.0 solid-state drive
- Asus ROG Ryujin II 360 cooler
- NZXT 850W power supply
- DimasTech Mini V2 case
- Windows 11

We primarily use games and the 3DMark Time Spy synthetic benchmark to measure relative gaming performance. Our game testing is carried out at 1080p so that the graphics card doesn’t skew our relative frame rate results: on the high-end GPU we use for CPU testing, games won’t be GPU-bound at 1080p, so the performance difference between chips mostly comes down to the CPU and its supporting platform.
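The relative comparison itself amounts to normalizing each CPU’s average 1080p frame rate against the fastest chip in the group. A hypothetical sketch (the CPU names and numbers below are made up for illustration):

```python
def relative_performance(avg_fps: dict) -> dict:
    """Express each CPU's average frame rate as a percentage of the fastest."""
    best = max(avg_fps.values())
    return {cpu: round(100 * fps / best, 1) for cpu, fps in avg_fps.items()}

results = {"Core i9": 180.0, "Ryzen 9": 171.0}
print(relative_performance(results))  # {'Core i9': 100.0, 'Ryzen 9': 95.0}
```

Because the GPU and the rest of the platform are held constant, those percentages track CPU differences rather than graphics bottlenecks.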

In addition to gaming tests, we use synthetic CPU productivity tests to measure rendering and encoding performance and memory bandwidth. We also use PCMark 10 to provide an overall system usage index score. Furthermore, we measure power consumption and temperatures using HWInfo software during peak and idle usage of the CPU.

We also perform some light overclocking to test the hardware’s capability and to see how much further it can be pushed in a home setup. We do not use liquid nitrogen but aim to find a level of overclocking that would be comfortable for users over an extended period and assess whether it makes a significant difference in performance.

How do we test PCs and Laptops?

Our comprehensive testing of full computer systems combines both CPU and GPU methodologies to ensure that we test games at a level that showcases the benefits of both processors and graphics components. We evaluate memory performance, storage speed (both in terms of synthetic throughput numbers and gaming load times), power draw, and temperatures to provide a complete analysis of the system’s capabilities.

For laptops, we use Lagom LCD test screens and our own experience to determine the quality of the attached screens. In the mobile world, battery life is a critical factor. We test how long a notebook lasts while watching a movie or typing, but more importantly, we test how long it can last while playing intensive PC games. To assess this, we employ the PCMark 10 Gaming Battery Life Test benchmark.

However, our evaluation is not just based on benchmark tests. We also use the systems as if they were our own devices, living with them for a period to determine how they perform on a daily basis and how their individual components function. Personally, I prefer to write my reviews on the laptop being evaluated, as it gives a better sense of day-to-day usage.

How do we test SSDs?

To accurately assess the performance of storage drives, be it an internal or external SSD, it is essential to use a combination of real-world and synthetic benchmarks, and to employ different software to measure different aspects of performance. For instance, we rely on the ATTO and AS SSD benchmarks to evaluate peak throughput and performance with incompressible data, and we use CrystalDiskMark 7.0.0 to cross-check our sequential and random read/write figures.

Moreover, the PCMark 10 Storage benchmark provides an index score, along with bandwidth and access time data.

To complete our assessment, we measure the time required to copy an entire 30GB folder containing different file types, representing a small Steam game folder. Additionally, we use the Final Fantasy XIV game load time benchmark.
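That real-world copy test boils down to timing a recursive folder copy and dividing total bytes by elapsed time. A minimal Python sketch of the idea (the paths are placeholders, and our actual 30GB test folder is not reproduced here):

```python
import shutil
import time
from pathlib import Path

def timed_copy(src: str, dst: str):
    """Copy a folder tree and report (elapsed seconds, average MB/s).

    A rough stand-in for our 30GB mixed-file copy test.
    """
    total_bytes = sum(p.stat().st_size for p in Path(src).rglob("*") if p.is_file())
    start = time.perf_counter()
    shutil.copytree(src, dst)  # recursive copy, dst must not already exist
    elapsed = time.perf_counter() - start
    return elapsed, (total_bytes / 1e6) / elapsed
```

Mixed file sizes matter here: a folder of many small files stresses a drive very differently from one large sequential file, which is why a game folder makes a good test payload.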

How do we test Monitors?

When it comes to evaluating gaming monitors, the most crucial factor is the quality of the panel and the technology used, including the backlighting system. While we rely on Lagom LCD screen testers to measure various aspects like black levels, white saturation, and contrast, our extensive experience in testing screens is also critical to assessing how a monitor feels to use.

In other words, we need to play games to truly evaluate a gaming monitor. It’s not an easy job, but it’s necessary.

Our testing process always begins with the default factory settings, and we then work through the monitor’s feature set, measuring potential blurring and ghosting, including inverse ghosting (overshoot).

While the panel’s quality is vital, the surrounding factors are also significant, even if they are more subjective. Factors like build quality, design, and feature set can still differentiate one monitor from another, even when two monitors use the same panel.

How do we test Peripherals?

When it comes to peripherals, the most important aspect is always going to be the individual’s subjective, personal experience with the device. Within our team, we each have our own preferences when it comes to keyboard switches. For example, I prefer a heavy switch with a light tactile bump, while Jacob prefers a certain membrane switch. Katie enjoys a dedicated clickity-clacker, and as for Alan… well, we don’t really talk about Alan’s preferences.

However, with decades of experience using hundreds of different keyboards, mice, and headsets, we know when something has been well-built, well-designed, and properly priced. Though Chris tends to focus solely on whether or not the peripherals will work in his BIOS.

When it comes to headsets, things get a bit more interesting. Each of us has a specific set of tracks and games we use to assess the performance of a set of drivers. While some of us even go as far as to measure a headset’s quality by how well it fits a dog (bespoke tests by Hope aside), we all follow similar testing methodologies. We consistently use the same key songs that we know well so we can hear the tonal differences between different headsets.

We also have a script for mic testing, allowing us to truly hear how certain microphones pick up different sounds and how well they represent our own voices. Trust us when we say that you haven’t heard a great microphone until you’ve heard Jorge’s out-of-context game quotes through it.

In conclusion, while individual preferences certainly play a role, our extensive experience and rigorous testing methodologies allow us to confidently recommend well-built, well-designed, and properly priced peripherals.