OpenAI’s GPT-4 Is The Best The Industry Has Seen So Far (published March 15, 2023)

OpenAI, the company that rose to fame with its state-of-the-art AI-powered chatbot ChatGPT, has once again surprised the industry with its latest model, GPT-4, an AI system that understands both images and text. Although it can take inputs in both textual and visual form, its output is purely text-based.

OpenAI describes GPT-4 as “the latest milestone in its effort in scaling up deep learning.”

On a side note, the image input feature is yet to launch publicly. For now, it’s only available to a single tester, Be My Eyes, an app developed to help visually impaired individuals with their day-to-day activities.

For now, GPT-4 is only available to existing OpenAI users through the ChatGPT Plus subscription.

OpenAI also confirmed that this latest version has done away with most of the bugs and shortcomings of the previous version. It’s intuitive, creative, and collaborative, and it is far more capable of solving complex problems than any previous version.

Despite such tremendous progress, a few problems remain. For instance, GPT-4 is still prone to making up false information, a failure mode often called hallucination. The company also warns that the model isn’t aware of any event that happened after September 2021, so questions about anything past that cutoff can confuse it. These hiccups, however, don’t seem to have hindered GPT-4’s demand.

Pricing of GPT-4

The company has set a usage limit for these users until the model launches in full. The pricing so far seems quite reasonable: $0.03 per 1,000 prompt tokens, which translates to roughly 750 words, while completion tokens cost $0.06 per 1,000.

Developers interested in getting API access will have to sign up for the waitlist.

The prime difference between the two is that prompt tokens cover the text you send to GPT-4, whereas completion tokens cover the text the model generates in response. A rough cost estimate for a single request is sketched below.
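As a concrete illustration, here is a minimal sketch of one GPT-4 call that tallies both kinds of tokens, using the openai Python package as it existed at launch (the pre-1.0 ChatCompletion interface). It assumes an OPENAI_API_KEY environment variable, granted GPT-4 API access, and the launch list prices quoted above.

```python
# Minimal sketch: one GPT-4 request, then a cost estimate from the
# prompt/completion token counts the API reports back. Assumes the
# pre-1.0 openai package and an OPENAI_API_KEY environment variable.
import openai

response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Summarize GPT-4 in one sentence."}],
)

usage = response["usage"]
cost = (usage["prompt_tokens"] / 1000) * 0.03 \
     + (usage["completion_tokens"] / 1000) * 0.06

print(response["choices"][0]["message"]["content"])
print(f"prompt tokens: {usage['prompt_tokens']}, "
      f"completion tokens: {usage['completion_tokens']}, "
      f"estimated cost: ${cost:.4f}")
```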

It’s also important to note that tokens are chunks of raw text, and they don’t map one-to-one onto words. A longer word such as “interesting” may be split into several sub-word pieces, so a single word can consume more than one token.
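You can inspect these splits yourself with OpenAI’s open-source tiktoken library; a small sketch, assuming cl100k_base (the encoding used for the GPT-4 family):

```python
# Show how words break into sub-word tokens. cl100k_base is the
# encoding used by the GPT-4 family of models.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")
for word in ["interesting", "a", "antidisestablishmentarianism"]:
    token_ids = enc.encode(word)
    pieces = [enc.decode([t]) for t in token_ids]
    print(f"{word!r} -> {len(token_ids)} token(s): {pieces}")
```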

How is GPT-4 better?

OpenAI further notes that regular users might not notice the difference between GPT-4 and its predecessors in casual use. During testing, however, the difference showed up clearly.

For instance, this latest version is far better prepared for exams like the SAT Evidence-Based Reading & Writing section, the Uniform Bar Exam, the LSAT, and SAT Math. On all of these, GPT-4 scored in the 88th percentile or higher.

OpenAI has already partnered with several companies, such as Khan Academy, Duolingo, and Stripe, to build products on the new model.

While GPT-4 might seem like a brand-new launch to most of us, it has actually been in the market for some time. As Microsoft has confirmed, its recently launched Bing Chat, developed in collaboration with OpenAI, is actually running on GPT-4.

Smart and Stylish: Watches That Can Do a Little More Than Tell Time (published September 28, 2022)

One of the most exciting developments of this century is the massive prevalence of innovative portable technology. Despite the relatively recent innovation of these smart technologies, they are now commonplace in society. For example, cell phones have been replaced by smartphones, computers replaced by laptops and smart tablets, doorbells and lamps have received smart technology innovations, and pocket and wrist watches have made way for smartwatches.

Smart and Stylish — Here are our favorite smartwatches on the market in 2022.

Apple Watch Series 8

Our first smartwatch feature comes from the global tech giant, Apple. The Apple Watch Series 8 is the newest edition of one of the most popular smartwatches worldwide. Starting at $399, the smartwatch from Apple is one of the most high-end watches on the market today.

The Series 8 has all the same features as previous models, including heart rate and health monitoring, iMessage and Apple Pay functionality, GPS tracking and navigation, LTE browsing, and more.

Along with these functions, the new Series 8 also comes with a larger screen and a crash detection system. Furthermore, it now features a low-power mode that massively extends battery life. New health sensors also allow the Series 8 to monitor and track fertility cycles, providing users with crucial information. With a wide array of features and a durable design, the Series 8 is an excellent option for anyone.

Apple Watch Series 7

I quickly want to touch on the Series 7 Apple Watch before moving forward. This is a great alternative for those who want a high-end smartwatch at a more affordable price. Since the release of the Series 8, the older models sell for around $150 less than their new counterparts. The Series 7 has nearly all the same features as the Series 8 and still functions very effectively. So for those who want an Apple smartwatch without the extra bells and whistles, the Series 7 is a great option.

Fitbit Sense

Next on our list of stylish smartwatches is the Fitbit Sense. Fitbit has made a name for itself as one of the top brands in the smartwatch and fitness industry, and its newest model, the Sense, continues to impress. The Fitbit Sense features a sleek design and a battery life of nearly six days.

It may lack some of the iPhone compatibility features of the Apple Watch series, but it offers top-of-the-line health and wellness technology. For example, the Fitbit Sense provides ECG readings, blood oxygen levels, skin temperature tracking, and more.

All of this makes the Fitbit Sense one of the top fitness smartwatches on the market. Furthermore, it starts at just $249.99, making it an affordable option compared to the new Apple Series 8.

Fitbit Versa 2

An even more affordable counterpart to the Fitbit Sense is the Fitbit Versa 2. Starting at just $149, the Versa 2 is one of the most affordable smartwatches on the market. Although Fitbit has since released the Versa 3 and Versa 4, the Versa 2 still stands as one of the best value options around.

Customers can expect a lightweight, durable design, and the Versa 2 features an always-on display. In addition, it offers high-quality fitness monitoring and built-in Alexa support. While it doesn’t include GPS, the Versa 2 is still a quality option for those shopping for a smartwatch on a budget.

Samsung Galaxy Watch 5

Our next smartwatch is the Samsung Galaxy Watch 5. Samsung’s counterpart to the Apple Watch series is perfect for Android users. With a price starting at $279, or as low as $114.99 with trade-in, the Galaxy Watch 5 isn’t going to break the bank. The smartwatch features a scratch-resistant screen in 40 mm and 44 mm sizes and a 50-hour battery life. In addition, the design is lightweight, sporty, and sleek, with a few different color options.

Like the Apple Series watches, the Galaxy Watch 5 comes with Samsung Pay compatibility and access to an app store. The watch also includes a built-in skin temperature sensor, Google Assistant support, and many more Android- and Samsung-specific features. So for any Android smartphone user looking for a smart and stylish watch, check out the Samsung Galaxy Watch 5.

Amazfit Bip S

The final smartwatch we will look at today is the Amazfit Bip S. Starting at $39.99, the Amazfit Bip S is the best cheap smartwatch on the market. It doesn’t include many of the extra features of the smartwatches above, but it is effective nevertheless.

The Bip S can play music, provide GPS services, monitor vitals, check the weather, and more. It comes in various colors and has around 24 hours of battery life. For those looking for a genuinely affordable smartwatch option, we highly recommend the Amazfit Bip S.

Indices Vs. Forex Trading: How Are They Different and Which Is Better? (published September 14, 2022)

If there is one thing to be grateful for in the current age, it is the fact that there are opportunities to make money everywhere you look. Some of the best opportunities lie in trading financial assets and commodities. With so many kinds of trading in existence, such as crypto trading and stock trading, two markets worth a close look are forex and indices. As a prospective trader, you should know how they are similar, how they differ, and the pros and cons of each; this piece explores all of that in concise detail.

What is Index Trading?

It is a very popular practice to speculate on the values of stocks and place trades accordingly to make a profit. Index trading, however, takes it a step (or several, in fact) further. It involves trading not one but several shares simultaneously, sometimes spanning over a hundred different stocks. Indices themselves are groups of shares characterized by one or more parameters, such as location or specialization. These groups are assigned a collective value and are listed for public trading. Index trading thus involves trying to predict changes in the values of indices and seeking to profit from price movements by going long (betting on an upward move) or going short (betting on a downward move).

This means you should not expect to take possession of any assets or stocks in index trading, unlike some other forms of trading such as spot forex. It is a derivative market: the “asset” you trade merely derives its value from the real underlying assets. An index may derive its value from the collective market capitalization of its component institutions or from their aggregate share prices, so the companies with the larger market capitalization or share value influence the index price the most. Traders can monitor factors such as company financial results, economic conditions, mergers, and market performance, all of which can affect a company’s, and therefore an index’s, price.
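To make the weighting point concrete, here is a toy sketch of a cap-weighted index with two made-up constituents; every number is invented purely for illustration.

```python
# Toy cap-weighted index: the level is total market cap over a fixed
# divisor, so moves in the biggest constituent move the index most.
def index_level(constituents, divisor):
    total_cap = sum(price * shares for price, shares in constituents)
    return total_cap / divisor

divisor = 1_000_000  # scaling factor chosen by the index provider
base = [
    (150.0, 16_000_000),  # large company: (share price, shares outstanding)
    (40.0, 2_000_000),    # small company
]
print(index_level(base, divisor))  # 2480.0

# The same 10% price rise has very different effects on the index:
big_up = [(165.0, 16_000_000), (40.0, 2_000_000)]
small_up = [(150.0, 16_000_000), (44.0, 2_000_000)]
print(index_level(big_up, divisor))    # 2720.0 (the index jumps)
print(index_level(small_up, divisor))  # 2488.0 (it barely moves)
```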

What is Forex Trading?

Forex trading is a whole different ball game. While index trading is a spin-off of stock trading, forex trading is an entirely separate trading class. It involves exchanging a particular currency for its equivalent in another currency, serving as a means of foreign exchange for countries, banking and financial institutions, and retail traders and investors alike. The market in question here is a virtual one, dominated by banks that trade forex on behalf of their customers. It also allows investors to predict the price movements of currencies against other currencies and seek out profit-making opportunities. This works by trading the two currencies as a pair, with the base currency coming first in the pair.

For instance, wanting to trade USD against the euro means opting for a USD/EUR pair. Forex trading comes in various forms, such as spot trading, futures trading, and forward trading. The spot market is where the classic currency-for-currency trades take place. Forwards and futures are derivatives: rather than exchanging real currencies, two parties agree to exchange them in the future under set terms. A small worked example of how a pair quote translates into amounts follows.
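Here is a brief sketch of reading a pair quote and of the profit from a favorable move; the rates are made up for illustration.

```python
# In a USD/EUR quote, USD is the base currency and EUR the quote
# currency, so the price is euros received per one dollar.
rate = 0.92  # hypothetical USD/EUR rate

usd_amount = 1_000.0
eur_amount = usd_amount * rate
print(f"{usd_amount:.0f} USD buys {eur_amount:.2f} EUR")  # 920.00 EUR

# Going long 1,000 USD against EUR and closing after the rate rises:
entry_rate, exit_rate = 0.92, 0.95
profit_eur = usd_amount * (exit_rate - entry_rate)
print(f"Profit: {profit_eur:.2f} EUR")  # 30.00 EUR
```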

Differences between Index Trading and Forex Trading

No doubt, index trading and forex trading have a lot of similarities. For one, they both involve trading financial instruments. Additionally, index trading operates much like the derivatives side of the forex market, i.e., forwards and futures trading. However, what is under the microscope here is not what brings them together but what sets them apart.

  • Available Markets: Indices trading is a purely derivative market, with no option of buying the underlying asset; buying an index position does not give the trader the stocks of the institutions in the index. Forex trading, however, offers spot trading, which lets traders take possession of the currencies in question. In fact, the spot market is the most popular and most liquid form of forex trading.
  • Value Consideration: In index trading, traders typically have to consider only one value: the price of the index they want to trade. Forex trading, on the other hand, requires factoring in the price of the currency being traded as well as the currency it is traded against.
  • Volatility: Significant value movements are required to shift index prices, which makes indices less volatile and more suitable for multi-day traders, such as swing traders. Forex, conversely, experiences far more volatility due to the daily economic changes around each currency, making it more suitable for intraday traders such as scalp and day traders.

Which is Better?

It is a tall order to say outright that one is better than the other. The truth is that each has its own pros and cons, which suit different traders depending on their preferences. For instance, the slower pace and relatively low level of difficulty of index trading may suit beginners and multi-day traders. The pacier, riskier nature of forex trading may prove more profitable for well-versed, experienced forex traders. Thus, it depends on your circumstances, so long as you accept that each comes with its risks and rewards!

Construction Tech Review of 2022 (published August 31, 2022)

As the construction industry continues to grow and thrive, many construction tech advances are being made to ensure stronger, sturdier, and better-quality builds. The industry has changed so much, especially since the invention of computers. 

Today, many construction projects utilize different technologies from the inception of a project all the way through completion. Innovation has been key in bringing construction careers to the forefront of employment opportunities, and this is just the beginning.

Here is a look at the major pieces of construction technology to watch this year.

Drone Use in Construction

Drones are commonly used for taking photos or creating beautiful videos from heights and angles not easily reached by the average photographer. They are often used for filming television and film scenes as well. So how and why are drones such a big deal in the construction industry? Drones help as an extra set of eyes for security, mapping, tracking inventory, and even assisting with building inspections. 

These tiny flying cameras are a huge help at all stages of a construction project. Many options on the market exist today for mapping and surveying. Drones are capable of flat mapping and 3D mapping. Not only can drones do all of the things mentioned above, but they are extremely cost-effective. Before drones, people had to use pilots to do the same work, which was expensive, time-consuming, and harmful to the environment. Drones save time and money and are only getting better and better. 

3D Printing in Construction

Like drones, printers don’t seem like they would be a game changer for the construction industry at first glance. However, being able to turn a pencil drawing or digital design into a tangible, 3D-printed creation is a huge advancement. 3D printers see so much use in the construction industry that average 3D printing systems were not enough, leading to the development of printers built specifically for the industry, called construction 3D printers.

Not only does a 3D printer create a miniature version of a large-scale project, allowing you to make changes before building the real thing, but the printed model can also highlight red flags that save you time and money down the road. If a design has a flaw, the model can reveal it before construction begins. 3D printing is not just for miniature models, though. In fact, it is a great alternative when building in a location considered unsuitable, harsh, or dangerous for a human workforce.

As our understanding of 3D printing expands over the next decade, we will also be able to save money and reduce waste by printing man-made materials like bricks and concrete.

Construction Robots and Exoskeletons in Construction

The idea of employed robots may seem like something out of a sci-fi movie from the distant future, but it’s already happening today, especially in times of pandemics, social distancing, and labor shortages. There are several kinds of robots, from standard factory robots to fully autonomous creations. 

Standard factory robots are those that are built to understand and execute a single task repeatedly. There are also collaborative robots. Think of those like a golf caddy. They make it easier for human employees to do their job, usually by carrying their tools as needed. Lastly, the fully autonomous robots are the ones you’ve seen in movies. These robots are able to act like a person. They can select the correct tools and complete a task on their own.  

As for construction exoskeletons, also called exosuits, they’re more machines than robots, although it’s easy to see the confusion. Construction exoskeletons are machines a construction worker can wear. The machines have motorized joints to make repetitive motions much easier to complete. Squatting, kneeling, and bending are simple tasks. However, repeating them daily while working long days can be detrimental to the body, causing injury. 

Exoskeletons come in several forms. There are back supports, shoulder supports, and even chair supports. In short, these devices alleviate pain and pressure from the back, shoulders, knees, and hips. There are also exosuits that can encompass the whole body, making it easier to lift heavy inventory items. 

Whether you’re using a robot or an exosuit, the physical workload for employees will be reduced, keeping their bodies in strong, healthy condition for much longer.

Construction Tech Review of 2022

There you have it — a review of 2022’s construction tech and all of the advancements that have made construction come back with a boom post-quarantine. Technology and construction don’t look like they go hand-in-hand at first glance. However, the two are constantly working together, making each industry stronger. The future of technology, especially in the construction industry, looks brighter than ever. 

Top Camping Lanterns (published August 29, 2022)

In a highly digital, increasingly online world, it can be more enjoyable than ever to get away from all the noise and go camping. If you’re anything like me, you understand that the best part of camping comes at night, sitting around a campfire underneath the stars. But before we get ahead of ourselves fantasizing about a night staring at the sky, we first need a camping lantern to guide us in the dark. Thankfully, modern tech means the top camping lanterns today are lightweight, durable, and affordable.

In this article, we will take a look at some of the top camping lanterns on the market, covering a wide variety of products suitable for any need or budget.

Luminaid Packlite Max 2-in-1 Power Lantern

The first lantern on our list is a perfect option for any level of camper, from novice to expert. If you are looking for a high-quality lantern at a reasonable price, the Luminaid Packlite is a great choice. One of its biggest selling points is its size: weighing around half a pound and able to fit inside a backpack, this lantern is great for maximizing storage space.

Don’t let the small size fool you, though; this is a top-quality camping lantern. In low-power mode, tests show the Luminaid Packlite delivers around 50 hours of continuous battery life. And even if you run out of power, don’t fear: it comes with a built-in solar panel for automatic charging during the day. So whether you are a newbie or a master camper, this lantern could be right for you.

Black Diamond ReMoji Lantern

Continuing our theme of small and easy-to-transport lanterns, we next focus on the Black Diamond ReMoji lantern. The Black Diamond Remoji is an extremely lightweight, handheld camping lantern currently selling for under $30. At its maximum brightness, the lantern can provide around 100 lumens of light, and it has a couple of different lighting features for users to choose from.

The lantern doesn’t have some of the fancier bells and whistles of other lanterns, but that isn’t necessarily a criticism. The Black Diamond ReMoji lantern is meant to be minimalist. It is lightweight, extremely small, and easy to use in any setting. The lantern recharges via a USB cable and can fit in your pocket. If you need a portable lantern to take with you on the go, then the Black Diamond ReMoji may be perfect for you.

Coleman Deluxe Propane Lantern

Shifting gears, we now turn to a more classic-style lantern. The Coleman Deluxe Propane lantern is arguably the top propane lantern on the market. Although it may seem outdated, gas-powered lanterns still have practical uses, chief among them the ability to operate in cold temperatures that can quickly drain battery-powered lanterns. The Coleman Deluxe Propane lantern provides light for up to 8 hours on high power or around 14 on low. On its highest setting, the lantern can light up to 23 meters of the surrounding area.

Selling for $48.74 at the moment, it is another affordable option. For campers venturing into more hostile environments, the Coleman Deluxe Propane lantern is a great option to have available. And even if you don’t camp and prefer to stay inside, it is always a good idea to have a backup source of lighting, especially one that is gas-powered.

Goal Zero Lighthouse 600 Lantern

Last on our list of top camping lanterns is the Goal Zero Lighthouse 600. The case could be made that this is the most versatile and well-rounded camping lantern on the market today. A price of $69.95 makes it the most expensive lantern on our list, albeit with good reason. It can provide up to 600 lumens, enough to single-handedly light almost any campsite, and it has numerous brightness and light-direction features, all while being small enough to fit in almost any bag.

The Goal Zero Lighthouse 600 comes with a lightweight metal handle for easier carrying or hanging on a hook. Furthermore, the lantern even includes a USB port that users can access to charge other devices. All in all, it is hard to find a more versatile lantern than the Goal Zero Lighthouse 600.

Full Review of Logitech G933 Wireless Gaming Headset (published August 23, 2022)

Whether you or someone you know is into PC, console, or even mobile gaming, it is highly likely that gaming is a part of your life in some way, shape, or form. Along with the growth of these different kinds of gaming, we are also witnessing the rise of online multiplayer gaming. This means that being able to communicate virtually is also a growing priority. Thus the rise of gaming headsets. We have reviewed other wireless Logitech headsets in the past. But now, we shift focus to a full review of the Logitech G933 wireless gaming headset.

Basic Overview – 8.2/10

The Logitech G933 wireless headset is a very well-rounded gaming headset. Its wireless design means no pesky cords getting in the way or ending up in a tangle. Beyond that, the headset delivers on all the essentials you need from a gaming headset: the design is passable, the mic quality is great, and the sound quality is very solid. And unlike the wireless Corsair headset, there are no concerns about material quality or product durability. All in all, the Logitech G933 wireless gaming headset is a great, versatile option.

Design – 6.3/10

As with most higher-quality gaming headsets, the Logitech G933 Artemis Spectrum has a somewhat bulky design. Two large square ear cups are connected by a wide headband that runs across the top. The headset comes in a two-tone black and gray design that, while not breathtaking, is quite sleek. The large, spacious ear cups are relatively comfortable and provide decent breathability, so you shouldn’t run into issues with sweating or overheating. While the ear cups are generally better than those of other gaming headsets, we would still advise against use in hot and humid environments. All in all, the G933’s design is solid but not the main selling point of the headset.

Controls/Ease of Use – 7.5/10

The Logitech G933 wireless gaming headset has an intricate and in-depth, albeit initially confusing, control system. Users have two different sets of controls depending on whether they are using the headset wirelessly: when wireless, the controls are on the right ear cup, and when connected by wire, they sit on the cord itself. Users can adjust a variety of sound and voice settings, as well as the basic play and pause functions. The buttons have a nice design and are very durable. All in all, the control layout of the G933 is well done and has no major weaknesses.

Sound – 8.1/10

The sound system of the Logitech G933 wireless is very high-end. The headset has very low latency, and the sound quality is crisp and distinct; you will have no trouble locating enemy gunfire and telling sounds apart. One of the coolest features of the G933 is its surround sound support, even though it isn’t quite up to the standard of a dedicated setup. Because of the limited space available, it won’t match a true stand-alone surround sound system, but it is still an awesome feature for a gaming headset. If the surround sound quality were better, this would easily be a true top-of-the-line gaming headset sound system.

Microphone – 9.3/10

One of the biggest selling points of the Logitech G933 wireless headset is the microphone quality. Users have rarely, if ever, reported issues with mic quality or durability, and the voice pickup is crisp and clear. The high-quality microphone means this headset is not only great for gaming and party chat but can also be used for crisp phone calls and more. All in all, the Logitech G933 provides one of the top microphones on the market today.

Convenience – 5.8/10

Unfortunately for potential buyers, the G933 isn’t ideal for travel or portable use. Its bulky size and inability to fold or collapse make it difficult to store safely and comfortably on the go. The headset also offers little in the way of noise cancellation, so it isn’t ideal for public transit like planes or buses. The mic is retractable rather than detachable, which can raise durability concerns over time. The headset is usable while traveling, but portability is not one of its strengths.

Quality of Material – 7.7/10

While still featuring a mostly plastic build, the Logitech G933 wireless gaming headset received tweaks that make it much more durable than previous Logitech models. The design is aesthetically pleasing, and the parts are durable and dense while still feeling relatively lightweight. The headband is comfortable and reinforced with metal bands for structure and support. Overall, the material quality of the G933 is very solid, and users shouldn’t expect durability issues.

Best Sony Bluetooth Speakers (published August 19, 2022)

Sony has a reputation for giving users high-output speakers that don’t sacrifice volume for sound quality. If you are in the market for the best Bluetooth speakers that Sony has to offer, we have compiled a variety of speakers that are sure to impress. Whether you need something portable and durable for a trip to the beach or you want some rich surround sound for your outdoor projector setup, Sony speakers have something that will meet your needs.

Sony SRS-XB13 Portable Bluetooth Wireless Speaker

The Sony SRS-XB13 wireless Bluetooth speaker is here to prove that good things come in small packages. At roughly 4 inches tall, it is one of the smallest Sony speakers. Don’t be fooled by its small size, though. The speaker sports an EXTRA BASS™ feature and a sound diffusion processor that allows the speaker to disperse your audio over a wider area. You can even pair a second speaker for a larger stereo setup. 

The 46mm driver produces a frequency range of 20Hz to 20kHz. It has some respectable bass, but the SRS-XB13’s primary focus is being portable and sturdy.

The SRS-XB13 boasts an IP67 rating, meaning you don’t have to worry about water or dust ruining your device. A full charge from the USB Type-C port will give you up to 16 hours of battery life. The Sony speaker also offers convenient hands-free calling with its built-in microphone.

You can find the SRS-XB13 for $49.99.

Sony SRS-XE200

The XE200 is the XB13’s bigger, stronger older sibling. Like the XB13, the XE200 is a portable Bluetooth speaker that has an IP67 water and dustproof design (with additional shockproof capabilities not found in the XB13). It also features USB-C quick charging with a robust 24-hour battery life.

The X-series designs (covering the XE200 and the slightly larger XE300) are made for superior sound and loud output. While a lot of wireless speakers have a hard time dispersing sound, the XE200 uses a Line-Shape Diffuser for a clearer, fuller output, distributing the front-facing sound consistently in every direction.

The XE200 can produce a more extended low-bass than the XB13 and is more customizable thanks to the graphic EQ featured in its companion app, so you can tweak its sound however you want. Compatibility with the Fiestable app offers other fun effects and additional customization.

You can find the XE200 for $129.99.

SRS-XG500 Portable Bluetooth Wireless Speaker

For those wanting a more premium Bluetooth device with lots of bells and whistles, the Sony SRS-XG500 has a wide array of features. Sony’s X-Balanced speakers deliver high levels of sound, bass that packs a punch, and easy Bluetooth connectivity without sacrificing the speaker’s clarity.

At just over 12 pounds, the XG500’s manageable build and carry handle make it easy to transport. The speaker holds up to 30 hours of battery life, and a 10-minute charge can power the XG500 for 3 hours. You can also charge your mobile device by plugging it into one of the speaker’s two USB ports. The outer mesh on the XG500 makes it water-resistant as well as dustproof.

The speaker functionality is enhanced with the Fiestable and the Sony Music Center mobile apps. These apps allow you to pair up with other XG500s, create custom sound profiles, adjust the speaker’s LED lights, and much more. If you want to jam, you can plug a microphone or an instrument into the speaker’s auxiliary input.

You can find the XG500 for $448.99.

LSPX-S3 Glass Sound Speaker

If you’re looking for a way to revamp the look of your living space while also adding crisp, 360º audio, look no further than the Sony LSPX-S3 Glass Sound speaker. The LSPX-S3 is both a portable Bluetooth speaker and a table lamp that resembles a candle. The LED light paired with clear sound can create ambiance in any space. The design may not be for everyone but if you feel your living space is in need of a minimalist statement piece, this may be a good option for you. 

The glass tube on the device isn’t only meant to help create a candlestick aesthetic. Sony also claims that there are three actuators in the speaker that attach to the end of the glass. These actuators then vibrate the entire glass tweeter “to spread sound in every direction.” 

The LSPX-S3 is a little more than 11 inches tall and weighs in at around 2.5 pounds. This additional weight in the base helps keep the device from unintentionally falling over.

You can find the LSPX-S3 for $348.

Sony HT-S400 2.1ch Soundbar

If your TV/entertainment audio lacks depth and clarity, think about the Sony HT-S400 Soundbar and wireless subwoofer. This soundbar features 2.1ch front surround sound with X-Balanced speakers to produce crisp dialogue while the wireless subwoofer delivers more depth on the low end. Connect any Bluetooth-compatible device to stream music. 

This soundbar comes equipped with S-Force PRO Front Surround and Dolby Digital. This helps create cinematic-style surround sound without needing a full surround sound setup.

Also, the kit is easily adjusted via a small remote control and an OLED information display on the soundbar. Other features include Bluetooth A2DP, HDMI ARC, and optical cable support. If you already have a Sony BRAVIA TV, you can connect the soundbar and subwoofer through Bluetooth for easy control and minimal cable clutter.

You can find the HT-S400 2.1ch Soundbar for $198.

The Best Time Management Apps for Business (published August 18, 2022)

As more folks move to remote or hybrid work, it can become difficult to keep track of tasks and projects. It’s not so easy to pop over to a coworker’s cubicle and ask how a project is going, and messaging a coworker for status updates over and over gets annoying. Because teams no longer work in close proximity to each other, there has to be a way for everyone to stay on the same page when it comes to time management.

Cue time management applications. By signing your company up for a time management app, you can track everything from your desk. This makes it much easier to keep track of everyone and everything, from start to finish. Time management apps are a great way to make sure nothing is forgotten while staying on track to complete your tasks.

Here are five of the best time management apps for businesses. 

Monday.com

Monday.com is relatively new to the time management and project management game, but it packs a punch. Many companies are moving their teams to the new platform as a way to track projects and get better at time management as a whole. Monday.com allows users to build their own task lists in a variety of layouts that fit users’ needs.

Once you add in your tasks, you can assign the time you think each task will take and the time the task actually takes. There is also a great stopwatch feature that will time your tasks for you. So, you can focus on your work without juggling too many things at once. Monday.com is user-friendly and has lots of wonderful features, whether you’re managing a team of two or two hundred. 

Notion

Notion is a popular app amongst college students and those new to the workforce because it is extremely user-friendly and can even be customized to your visual preference. This time and project management tool is a great way for businesses and individuals to manage their tasks and make sure they’re getting their deliverables done in a timely manner.

Notion is used by big companies, including Headspace, Curology, Loom, and MatchGroup. Using this app allows you to connect your teams, docs, and projects and build multiple workspaces for each team. Notion is visually stunning and works really well on a tablet for those who want a gentle transition from writing with pen and paper to digital task tracking.

Asana

Asana is a great way to manage your own tasks, those of a team, or even your entire company. Nobody wants to email back and forth about the status of a task, track down info, or hunt for a colleague to get a project update. Asana makes it easier to check in on your teammates, see the status of each item, and update things in real time.

A study of premium users showed that Asana made teams 1.45 times more efficient, reduced status meetings and emails by 65%, and made it easier to get more work done. Time management apps like Asana help teams increase communication and accountability, and they help teams reach their goals and deadlines more efficiently. Team members can even build out calendars or view their tasks and deadlines as a list, depending on their preference.

Trello

Trello is another great time management app and a visual tool that makes it easy to follow and manage all kinds of projects. Teams can track tasks, manage workflows, add files or checklists, and even automate things as needed. Trello is visually pleasing because it puts all of the tasks in neat, clear columns. Team members can comment on cards and tag each other, allowing for direct conversation that sticks with the task instead of going missing somewhere in your inbox. This app is great for users looking for simplicity and user-friendliness.

Rescue Time

Rescue Time is a great time management tool for businesses. It helps teams use time wisely and efficiently to finish their tasks on time. For those easily distracted, Rescue Time is a great choice because it offers distraction-blocking tools to keep you focused on the task at hand. This time management planner is user-friendly and even sends out weekly reports to help you see where your time goes each week. The time-tracking feature is automatic, meaning no extra work for you or your team.

The Best Time Management Apps for Businesses

There you have it. Five of the best time management applications on the market today. Whether you’re looking for a new way to track and manage your time as a business owner or you’re looking for a way to make yourself and your team more conscious of time spent on projects, the five apps above are a great place to start. 

The Best Camera Stabilizer to Buy (published August 16, 2022)

One of the biggest surges of the 21st century has been in photography and videography. With platforms like Instagram, YouTube, TikTok, and more growing larger every day, it is easy to understand why. Because people post so frequently to these online social platforms, camera skills are becoming more and more important. When you combine these newer forms of media with more traditional ones, it becomes easy to see why there is such a resurgence in camera and video tech. One of the greatest innovations in camera technology in recent memory is the camera stabilizer.

Camera-stabilizing technology first came about in the 1970s. Invented by Garrett Brown, it was initially used to increase mobility for very heavy cinematography cameras. Today, however, camera stabilizers serve a much broader audience. While cinema cameras have become significantly lighter and more practical, capturing steady, clear shots can still be problematic. Thankfully, as our cameras have progressed, our camera stabilizers have followed suit. In the remainder of this article, we will look at a few of the best camera stabilizers you can buy.

How do Camera Stabilizers Work?

As the name implies, camera stabilizers serve to eliminate twitches and bumps from a shoot. Primarily used for videography, a good camera stabilizer can make a world of difference in product quality. There are three types of camera stabilizers videographers can choose from:

Handheld

The most basic form of stabilizer, handheld camera stabilizers are just what the name describes. The stabilizer attaches to the camera and is then held by the videographer or photographer. Handheld stabilizers help reduce shaking, but because of their build, they still require a steady hand for top quality.

Vest Stabilizer

The vest stabilizer system is attached to a vest worn by the videographer. Used primarily in high-budget cinematography, the vest stabilizer allows the wearer to walk around while still capturing a steady shot. Vest stabilizer systems consist of three parts: the physical vest, the arm that attaches the camera to the vest, and the sled, which actually balances the camera itself.

3-Axis Gimbal

The final main type of camera stabilizer is the 3-axis gimbal system. The “3-axis” refers to the three planes of movement the stabilizer adjusts for: pan (yaw), tilt (pitch), and roll. The 3-axis gimbal compensates for unsteady hands and for the challenges of capturing moving shots. These systems come in both motorized and non-motorized models, with the motorized models requiring a charge to operate.

Now, here are a few of our favorite camera stabilizers you can buy today:

Glidecam HD-Pro Handheld Stabilizer

Our favorite handheld camera stabilizer on the market is the Glidecam HD-Pro. This professional-grade handheld stabilizer is built from high-quality, highly durable stainless steel. It can comfortably support up to 4.5 kg of weight, which covers most cameras on the market, and it comes with a quick-release camera plate to make mounting your camera even easier. The Glidecam HD-Pro features a 3-axis design so users can ensure maximum camera stability. Because of the stainless steel, the stabilizer can be a bit heavy during long shoots, but all in all, it is one of the top handheld options available.

 DJI RSC-2

Next on our list is the DJI RSC-2 gimbal stabilizer. This stabilizer is perfect for capturing high-quality footage with a lightweight frame. Able to stabilize most any modern camera, the DJI RSC-2 can also fold in on itself for easy transport. The gimbal is motorized but offers 14 hours of battery life, so you can shoot all day without worry. Additionally, the stabilizer’s unique center column is the perfect tool for vertically shot social media videos. All in all, the DJI RSC-2 is one of the most versatile products on the market today.

Steadicam M-2

Our final recommendation comes from one of the biggest names in the camera-stabilizing industry: the Tiffen Steadicam M-2. The newest model from Tiffen is just as durable as previous models but comes in lighter and at a more affordable cost. “Volt” technology allows the camera to tilt, roll, and pan as necessary for maximum stability. The “Volt” controls can be customized and moved to either side of the camera to give the operator maximum control.

Steadicam users can choose from a variety of carbon fiber post lengths, as well as a couple of different sled options. The M-2 supports many configurations so the rig can be optimized to the user’s needs. Overall, the Steadicam M-2 could be the best camera stabilizer on the market.

Need a New Dish Drying Rack? Here Are Our Top 5 (published August 12, 2022)

Many Americans do dishes by hand to lower energy costs, save space in their kitchens, or avoid buying appliances as a renter. Others have dishwashers in their homes, but the drying system isn’t great. And sometimes we end up using items that aren’t dishwasher safe, leaving us to handwash our kitchenware. Those wet dishes can take up lots of counter space, making it difficult to cook your next meal, or simply create an eyesore and the illusion of a messy kitchen. We’ve rounded up the best dish drying racks to combat these issues.

Here are five of the top dish drying racks you can find on Amazon. (Links and pricing are accurate as of the date of publication.)

What to Look For?

When looking for a new dish drying rack, make sure you know exactly what dimensions your countertops are. There’s nothing more frustrating than finding the perfect item and waiting for it to be delivered, only to find it’s way too big or comically small. 

You’ll also want to decide whether it will stay in one spot or should be compact enough to stow in a cabinet when you need the extra counter space. Lastly, make sure there is a good drainage system in place; look for a drain to avoid nasty, wet countertops. Better yet, look for a product with a movable drain so you can place your drying rack on either side of the sink, depending on your needs.

Yamazaki Tosca 

If you’re looking for a dish rack that is cute but sturdy, the Yamazaki Tosca Dish Rack is just that. It comes in at 13.19 × 18.5 × 7.87 inches and weighs just over four pounds. This dish rack is not just beautiful; it’s also practical and functional, featuring wooden handles and a powder-coated steel frame.

On top of its look and sturdy frame, the Yamazaki Tosca includes a spacious utensil holder to dry forks, knives, spoons, and other cooking accouterments.

The Yamazaki Tosca Dish Rack is available for $88 and comes in white or grey. 

Yamazaki Home 2876 Wire Dish Drainer Rack

If you liked the Yamazaki Tosca Dish Rack, but are in the market for something smaller, check out the Yamazaki Home 2876 Wire Dish Drainer Rack. This smaller dish rack is sturdy and made of alloy steel and ABS resin. 

The Home 2876 also features a drainer that takes excess water and drains it directly into the sink to prevent a wet, messy countertop. You can use this rack for dishes, cups, mugs, and silverware.

The Yamazaki Home 2876 Wire Dish Drainer Rack is available for $59 and comes in black or white. 

Cuisinart Aluminum Rust Proof Dish Drying Rack

The Cuisinart Aluminum Rust Proof Dish Rack is a compact rack featuring a removable tray for easy wiping, a swivel drain spout for versatility, and a utensil caddy for customizability. This aluminum rack is rust-proof and comes in at 16 × 12 × 5.5 inches, making it perfect for an apartment.

The dish drying rack is available for $42.99 and comes in black or silver. 

Stainless Steel 2-Tier Dish Rack

This two-tier drying rack comes with a knife holder, a utensil attachment, and a removable water tray to catch any excess dripping from your dishes as they dry. The rack measures 16.7 × 10.5 × 10.2 inches and can hold up to 17 plates and 18 bowls.

It is strong and sturdy and can hold up to 110 pounds. Foot covers keep this rack from slipping, even when it’s filled with dishes.

The Stainless Steel 2-Tier Dish Rack is available for $36.89.

Romision 304 Dish Rack and Drainboard Set

The Romision 304 Dish Rack and Drainboard Set is a two-tier drying rack equipped with a moveable drain spout, a cup rack, a wine glass holder, and a utensil holder. It is perfect for those who enjoy hosting dinner parties.

The Romision measures 12.6 × 16.54 × 15.35 inches and can even be used to dry full-sized cookware. Four sleek, sturdy legs keep the rack about an inch off the countertop, and the movable spout means you can place it on either side of the sink without worrying about making a mess. Indeed, this solid stainless steel rack resists rust, oil, oxidation, and fingerprints, making it a sleek accessory for any kitchen.

The Romision 304 Dish Rack is available for $69.99 but comes in a larger size for an additional charge.

Finding the Perfect One for Your Kitchen

Finding the right dish drying rack depends on your needs, the amount of space available in your kitchen, and the price point you’re looking for. So, keep your kitchen and dishes clean with one of the dish drying racks above.

Top Portable Green Screens of the Past Year (published August 11, 2022)

For many, the Covid-19 pandemic was a wake-up call. When students, employees, teachers, and management alike were unable to come into the office, people had to adapt. Classroom sessions and in-person meetings were replaced with Zoom calls, and standard 9-to-5 hours disappeared in favor of more flexible work-from-home schedules. Thanks to these pandemic changes, we are now seeing new technologies with lasting impact. One such technology, the portable green screen, is changing the way people think about work.

Digital Nomads

One of the largest beneficiaries of portable green screens is what people now call the “digital nomad.” Put simply, a digital nomad is someone who works without the constraints of a physical office. Digital nomads can travel freely and still stay connected to their work through the internet.

Previously, digital nomads were relatively limited by the constraints of their jobs. With the changes brought on by the pandemic, we are seeing a massive spike in digital nomads as workers and employers alike take advantage of new technologies. Workers can work seamlessly while traveling and still take part in meetings through applications like Zoom. Thanks to developments like portable green screens, digital nomads can work remotely more easily than ever before.

What is a Portable Green Screen?

Before we go any further, we should first discuss what a portable green screen is. As the name implies, it is a green screen that is easily movable and quick to set up. Video software replaces the uniform green backdrop with any image through a process called chroma keying, so remote workers can present a clean, professional picture in virtual meetings. They can freely travel to and work from a park or other venue without detracting from the effectiveness of the meeting. While portable green screens can be an asset to anyone working remotely, others use them for something entirely different.
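For the curious, here is a minimal sketch of the chroma-key idea using OpenCV and NumPy. It assumes two same-sized local images, foreground.jpg (a subject in front of a green screen) and background.jpg; the green bounds are rough values you would tune to your lighting.

```python
# Minimal chroma key: mask the green backdrop in HSV space, then swap
# in the background image wherever the mask fires.
import cv2
import numpy as np

fg = cv2.imread("foreground.jpg")   # subject in front of a green screen
bg = cv2.imread("background.jpg")   # replacement scene, same resolution

hsv = cv2.cvtColor(fg, cv2.COLOR_BGR2HSV)
# Rough green bounds on OpenCV's 0-179 hue scale; tune to your lighting.
lower_green = np.array([35, 80, 80])
upper_green = np.array([85, 255, 255])
mask = cv2.inRange(hsv, lower_green, upper_green)

composite = fg.copy()
composite[mask > 0] = bg[mask > 0]  # replace green pixels with background
cv2.imwrite("composite.jpg", composite)
```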

Some content creators are taking advantage of portable green screens to make some really interesting and funny content. One such creator, TikToker Frankie Lapenna, has one of the most creative uses of a portable green screen. He is by no means the only creator using one, but he offers a good example of the freedom they allow. Whether you are a digital nomad, an aspiring content creator, or simply looking to upgrade your remote work, a portable green screen may be right for you. These are our top picks of the past year.

Arozzi Green Screen

The Arozzi portable green screen is a great lower-budget option. It features a wide, rectangular screen and a lightweight collapsible base. Starting at $99.99, it is easy to move around. The base is a bit flimsy and can break under force, but all in all, this is a solid pick.

AFHT Portable Green Screen 

Priced under $70, the AFHT portable green screen is one of the best cheap options available. Upon purchase, users receive the screen, with one green side and one blue, as well as the stand. The screen measures 5'x7' and folds down for easy transport. This is a great budget option.

Elgato Green Screen

Our next choice is one of the best options available for gaming content creators. If you are looking to become a gaming YouTuber or streamer, it's likely you've heard of Elgato. As one of the top names in the game-capture industry, Elgato makes a green screen that is a great choice. With an aluminum case and a pop-up design, it is one of the easiest to set up. It also features durable, high-quality materials, though they make it a bit heavier. At $159.99, this is one of the top portable green screens on the market.

Neewer Collapsible Backdrop

Last but not least is the Neewer collapsible backdrop. Like the AFHT screen, this product can be found on Amazon and has a similar oval design. In most ways, the two products are very similar, but at $85.99, the Neewer offers slightly higher material quality than the AFHT backdrop. Each of these products is a great option; it's up to you to choose the one that's best for you.

Smart Lamps: What are They and Our Review (https://techreport.com/review/smart-lamps-what-are-they-and-our-review/, published Wed, 10 Aug 2022)

One of the most incredible things about humanity and our history is that despite thousands of years on the planet, almost all the technology we see today came about within the last century or so. While we can trace the origins of these new technologies back to ancient discoveries like fire or the wheel, technological development has increased exponentially not just in the past century, but even in the past 20 years. One such example is smart home technology, including smart lamps.

While we may not have the hoverboards or stylish silver shades featured in Back to the Future II, we have seen incredible advancements in smart technology, particularly smart home technology. In this article, we will take a close look at smart lamps, explain why they could be right for you, and then evaluate some of our favorite options.

What are Smart Lamps?

Smart lighting, or smart lamps, is a form of lighting that works in connection with your smart home system. Users can control a variety of settings like brightness, color, and more from a connected smart device. Users also have the option to automate their lights to run at certain times of day or in response to motion. In doing so, users not only make their lives more convenient, but they also conserve energy. Depending on which smart lamp they choose, homeowners can also set shortcuts to certain light settings.

How Do They Work?

Most smart lamps, as with most other smart home systems, function via an online hub. These hubs act as a connecting point between your smart device and your smart home systems. Many smart lamps and light systems connect to your home via your router, although this is not always the case.

Most smart lamp systems can also send feedback to your device, letting you know about potential errors or necessary replacements. While the technology in these bulbs is complex, they are extremely user-friendly and easy to install. No matter your experience with smart technology, smart lamps can be an easy addition to your home.
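
As a concrete illustration of that hub model, here is a minimal sketch in Python of how a Hue-style bridge exposes bulbs over the local network. The bridge address and API key below are hypothetical placeholders for values you would get from your own hub, and other brands use different but conceptually similar APIs.

```python
# Minimal sketch of hub-based light control (pip install requests),
# assuming a Philips Hue-style bridge. BRIDGE_IP and API_KEY are
# placeholders for values obtained from your own bridge.
import requests

BRIDGE_IP = "192.168.1.10"   # hypothetical local address of the hub
API_KEY = "your-api-key"     # hypothetical authorized-user token

def set_light(light_id: int, on: bool, brightness: int = 254) -> None:
    """Ask the bridge to change one bulb's state; the bridge relays the
    command to the bulb over its own radio protocol (Zigbee, for Hue)."""
    url = f"http://{BRIDGE_IP}/api/{API_KEY}/lights/{light_id}/state"
    response = requests.put(url, json={"on": on, "bri": brightness}, timeout=5)
    response.raise_for_status()

# Example: turn light 1 on at roughly half brightness.
set_light(1, on=True, brightness=127)
```

Schedules and motion automations are, in effect, the hub calling endpoints like this on your behalf.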

Our Top 3 Smart Lamps & Lights

Philips Hue White and Color Ambiance

One of our favorite smart bulb systems comes from the renowned Philips company. The main selling point of the Philips Hue White and Color Ambiance smart lights is the in-depth companion app. The application comes with numerous features for time of day, color, automation, brightness, and more.

The app is extremely user-friendly, and the smart light connects easily with other home systems like Alexa. An extremely versatile and convenient product, the Philips smart bulb's only downside is its high cost. Despite the pricing, we feel that the Philips Hue White and Color Ambiance is truly the top-of-the-line smart lamp product.

Wyze Bulb Color

The next smart bulb on our list is the Wyze Bulb Color. Contrary to our previous entry, the top selling point of the Wyze bulb is its low cost. Typically, smart lamps and bulbs are very expensive, but the Wyze Bulb Color is perfect for those on a budget.

The Wyze bulb can connect with other home devices like Alexa and Google Assistant, but it doesn't have as many features as some other smart bulbs. Users can adjust both color and brightness from the app but don't have many other options available. For those looking for a basic and cost-effective option, the Wyze Bulb Color is perfect.

Sengled Smart LED with Motion Sensor

The final entry on our list of top smart lamp systems comes from Sengled, the self-proclaimed leaders in smart light technology. As with the previous options, the Sengled Smart LED comes with a variety of settings and can connect with other smart home systems. While the different settings are interesting, what truly sets this product apart is its high-efficiency motion sensor.

Users can customize the lights to their preference, and the lights will operate whenever anyone walks within a set distance of them. Not only is this feature convenient, but it can also provide huge savings on energy bills. Users will have to pay upfront for this high-end smart bulb but can then see huge savings down the road.

Final Thoughts

All in all, we hope this gives you a better understanding of smart lamps and helps you choose the right smart bulb for your home.

Top Shower Caps to Keep Your Hair Dry: Review (https://techreport.com/review/top-shower-caps-to-keep-your-hair-dry-review/, published Mon, 08 Aug 2022)

Don’t you hate it when you spend a bunch of time and money getting your hair done only to have it ruined the moment you step into a hot shower? A shower cap is a great way to protect your hair from the heat, humidity, and water when taking a shower. 

They can help you fight frizz, keep your hairstyle intact, and even keep your hair up and out of your face while you do your skincare routine.

Here are five of the top shower caps on the market right now. (Links and pricing are accurate as of the date of publication.)

What Makes a Shower Cap Great?

When you're shopping for a shower cap, you'll want to make sure it checks a few boxes. First, choose one that will fit your head. If you're shopping online and aren't sure what size to choose, consider an adjustable cap.

A cap that is too tight can be uncomfortable and cause headaches. It can even pull on your baby hairs, leading to hair loss. A shower cap that is too loose will act as a sauna, trapping heat inside and messing up your hair.

You'll also want to ensure the cap fits all of your hair. If you have big, long hair, like dreadlocks or braids, make sure you find one that can hold all of it. You can also look for one lined with silk to further protect your hair.

Next, you'll want to make sure the shower cap you choose isn't porous and won't absorb water. A cap that soaks up water is much more likely to grow mold or mildew.

Kitsch Luxury 

The Kitsch Luxury Shower Cap is made of nylon and has an elastic band made to stretch and fit a wide range of head sizes and hair lengths. The elastic lets you tighten the cap as needed, so you can tuck in any loose hairs before securing it.

This one is also waterproof, which helps keep the cap from growing mildew. It comes in lots of fun prints and works for all hair types.

The Kitsch Luxury shower cap is available for $21.59+ and comes in four design options. 

Drybar The Morning After 

The Drybar The Morning After shower cap comes from Drybar, the hair company known for giving the perfect blowout. This white and yellow cap is cotton-lined and holds all types of hair. It stays in place while remaining comfortable, keeping your blowout intact between washes. 

To wash your Drybar cap, just hand wash it in cold water using a mild detergent, then air-dry it inside out.

The Drybar The Morning After shower cap is available for $16.

Klorane Washable Terrycloth-Lined 

The Klorane Washable Terrycloth Lined shower cap comes in a beautiful white floral design and is both eco-friendly and machine washable on the gentle cycle. 

This cap is reversible, making it the perfect option for those who want to do a hair treatment or a hair mask. To do this, simply flip the terrycloth fabric to the outside so the product doesn’t get all over the fabric. Using a shower cap when doing a hair treatment is a great way to retain heat and keep the product off of your neck, face, and clothes while you wait.

The Klorane Washable Terrycloth Lined shower cap is available for $16.20.

Annie Waterproof Braid 

The Annie Waterproof Braid shower cap is the perfect solution for those with locs, long braids, and even extensions. It promises to keep locs 100% dry. The cap is more than two feet long and comes with a drying loop to make storage much easier.

Along with being a great way to keep larger and longer hairstyles dry, the Annie Waterproof Braid is made to last for a long time.

The Annie Waterproof Braid shower cap is available for $6.99.

EcoTools

The EcoTools shower cap is a great way to keep your hair dry, up, and out of your face. This product, like the entire EcoTools brand, is 100% vegan and cruelty-free; it is lined with organic cotton and even includes a reusable travel case for when you're on the go.

It comes in a pink tropical pattern, is breathable, and fits all head sizes.

The EcoTools shower cap is available for $9.81.

Finding the right cap can help you to protect your hair and save you lots of time when getting ready for the day. Protect your hair with one of the five shower caps above. For more reviews, see our review page.  

The Best Campfire Sprays of 2022 (https://techreport.com/review/the-best-campfire-sprays-of-2022/, published Thu, 04 Aug 2022)

In today's world, more and more of our lives are spent in the comfort of our homes. With remote work and social media becoming more prevalent every day, there is less and less reason for us to step outside. However, just because people are spending more time inside doesn't mean that we have to forget about the outside world. Thanks to an innovative new product, campfire spray, you can bring the aroma of the wilderness anywhere you go.

Put very simply, campfire spray is a spray that recreates the smell of a campfire in your home, car, or pretty much anywhere else. It's kind of like one of those pine tree air fresheners for your car, except with the smell of a bonfire.

Best Campfire Sprays to Keep You Covered

In an increasingly oversaturated market, it can be tough to choose between all the different options. In this article, we will take a look at and rank some of the best campfire sprays available today. Whether you are a hardcore nature enthusiast, or just someone looking to bring a little touch of wilderness into your home, we can help you find the perfect campfire spray. Here are our top campfire sprays of 2022:

1. Candeo Campfire Smoke

Our favorite campfire spray of 2022 comes from the folks over at Candeo. With a 3.5 oz spray bottle retailing at $9.99, Candeo Campfire Smoke won't break the budget. It is easy and quick to use and will have any space smelling like campfire smoke in no time.

The container is made of durable, sleek plastic, meaning it won't break on you before its time. All in all, Candeo Campfire Smoke is one of our favorite sprays on the market right now.

2. Fire in the Hole Campfire Spray by Outlaw

Next up on our list is the Fire in the Hole Campfire Spray Cologne from Outlaw. According to Outlaw, Fire in the Hole is meant to evoke the scents of campfires, whiskey, and gunpowder in one bottle. And in our opinion, they succeed. If you consider yourself rugged and outdoorsy, then this is the spray for you.

The product retails a bit higher than our previous entry at $24.99 per bottle, but the quality is worth the cost. Outlaw also makes Fire in the Hole soap, body wash, and other products if you would like to purchase more. All in all, we think this product from Outlaw is a great high-quality buy. 

3. Manly Before You Go Toilet Spray (Campfire)

Our next product is a campfire toilet spray from the folks over at Manly. The company advertises this product for dudes who create horrible-smelling toilets while doing their business, although we think it is nifty for absolutely anyone who's prone to this. Walk into the bathroom, and before you sit down, give one quick spritz, and the scents of a bathroom will be swapped for the scent of campfire smoke.

Manly Before You Go Toilet Spray retails at $14.99 a bottle on Amazon, and the company produces a number of other pre-bathroom scents if you’d prefer something else. We highly recommend any of the Manly toilet products.

4. Nature’s Oil Campfire Fragrance

The next product on our list is perfect for anyone who is into essential oils. Nature’s Oil Campfire is great for making soaps, air fresheners, shampoos, and other fun scented products.

Retailing at under $7, this product is easy to afford and of great quality. The oil has a fun woodsy, campfire-smoke scent, and it is entirely US-based. Each bottle is 15ml and sealed with an easy-to-use drip cap, allowing for safe storage and convenient usage. For anyone looking to experiment with essential oils, this may be the perfect campfire spray product for you.

5. Yankee Candle Crisp Campfire Apples Concentrated Room Spray

The final spray on our list today comes from the candle and scent powerhouse, Yankee Candle Company. Yankee Candle promises that just two quick sprays from the bottle will change the odor in a room for around four hours. The crisp campfire apple scent derives from a combination of warm, fruity apples and smoky campfire wood. Together, they create a feeling of warm comfort and nostalgia.

Each bottle is 1.5 oz and retails for around $8. Despite the small container size, the spray is so highly concentrated that each bottle should offer around 300 sprays. Customers also have the opportunity to buy in bulk with a 3-pack that is available on Amazon.

Final Thoughts

While there are many other campfire sprays that we didn't list here, we truly feel these are the five best products on the market today. Whether you are looking for concentrated sprays, cologne, or essential oils, this list should have you covered. At the end of the day, though, we recommend you do your own research and choose the product that works best for you. Hopefully, this article helps bring a little nature into your home.

Related Post: Top 7 Outdoor Projectors to Check out This Year

Pillow Cube Review: The Cool Pillow You Need (https://techreport.com/review/pillow-cube-review-the-cool-pillow-you-need/, published Mon, 01 Aug 2022)

Did you know that more than 70 million Americans deal with chronic sleep issues, which have been linked to an increased risk of depression, diabetes, high blood pressure, obesity, heart attack, and stroke? Many people with sleep issues are not using the right pillow at night. Whether it's too firm or not firm enough, it's important to find a pillow that works for you. Cue the Pillow Cube.

This compact, cube-shaped pillow is taking the Internet by storm. Designed for side-sleepers, the Pillow Cube is all over social media, and the reviews are coming in everywhere. Whether you prefer regular reviews on the website or video reviews on TikTok or YouTube, there are thousands to explore.

In fact, the Pillow Cube website currently has 280 five-star reviews, and the Amazon page has a four-star rating, with more than 1,300 reviews. So why do people love this sleep product so much? 

Why Do People Love It?

Pillow Cube users love this product for a myriad of reasons. It is designed for side sleepers by side sleepers, and its shape isn't arbitrary either. The cube shape aligns your spine while supporting your head and neck all night, filling the shoulder gap between your body and the mattress.

Along with those great perks, the Pillow Cube stays cool throughout the night as well. Its breathable cover helps keep the pillow cool, even for those who run hot. The temperature-regulating core also helps, allowing you to sleep soundly. Customers also rave about the pillowcase being smoother and softer than others.

Because this pillow helps your body sleep more comfortably, you are much more likely to fall asleep quickly, stay asleep longer, and wake up feeling happier and more refreshed.

Size Options and Specs

This pillow is available in three options: a four-inch 12×12, a five-inch 12×12, and a six-inch 12×12. It is important to choose the right pillow height when shopping for a cube. There is also a larger version, called the Pillow Cube Pro, that is 12×24 inches for those looking for a pillow that works better with the standard pillowcase shape. Many Pillow Cube users buy both the 12×24 and the 12×12 and use the smaller pillow for traveling.

The wrong pillow can stress out your entire body, causing many physical issues, from headaches and migraines to neck and back pain. It is recommended that if you fall between sizes, you go up a size. If you order the wrong size, you can always return the pillow within 30 days to get your money back.

Both the pillowcase and outer cover are machine washable to ensure your pillow stays clean and fresh for much longer. The soft yet supportive center boasts a cooling memory core made of viscoelastic foam. This material molds to fit perfectly between your neck and shoulders while still giving you the sensation that your head is held comfortably in place as you sleep.

Is This Pillow Right For You? 

Another major Pillow Cube pro in the review section is customer service. Pillow Cube users are very happy with the quality of service they receive when interacting with the company, making exchanges and returns easy. 

Created by a side sleeper, this product may be right for you too. It is important to check in with yourself and parse out your sleep needs when shopping for a new pillow. If the Pillow Cube sounds like it could be the product for you, check out some of the many reviews left by users over the last few months. 

This pillow has been featured in the NYTimes, Tuck, New York, Mental Floss, and Apartment Therapy. If you are a side sleeper, searching for a solution to keep your neck aligned and your pillow cool, the Pillow Cube may be exactly the product to try. 

Review the Pillow Cube for Yourself

Many customers love the Pillow Cube. But to know if it's the right pillow for you, you'll have to give it a whirl for yourself. Having the right pillow for your body can make all the difference in how rested you feel the next day. Getting a good night's sleep can improve your mood and mental clarity, and it can even ease those small, nagging aches and pains in your body.

You can order the Pillow Cube on the company's website. It is also available for purchase on Amazon in a variety of size options. If you are not a side-sleeper yourself, the Pillow Cube makes an excellent gift for the uncomfortable side-sleeper in your life. Sweet dreams, side-sleepers.

Also see: E-WIN Champion Gaming Chair: Ergonomic Comfort Wins

Popular Software Products, According to Reviewers (https://techreport.com/software-news/popular-software-products/, published Thu, 28 Jul 2022)

Software development trends are constantly evolving, with new software being released every day. However, some software is still dominant in 2022 and favored by users. With that in mind, today, we’ll take a look at the most popular software products, according to reviewers.

1. Accounting Software: QuickBooks

QuickBooks is the industry standard in accounting software, especially for small businesses. This popular software is a solid choice for a variety of businesses, particularly those that work with accountants and bookkeepers.

Even though the software is user-friendly and frankly relatively easy to understand, it still has a learning curve if you are unfamiliar with accounting basics.

The software covers multiple functions, including payroll, bookkeeping, invoice management, expense tracking, bank reconciliation, tax management, and financial reporting. Additionally, QuickBooks features an intuitive mobile app compatible with iOS and Android devices.

2. Website Building Software: Wix

This might come as a shock to most people, especially die-hard fans of WordPress, but in 2018, Wix ranked as the leading website builder worldwide by market share. The website builder's revenue has continued to grow since then.

However, WordPress has a solid following online, with almost half of the world's websites built on its platform. All in all, both industry giants are constantly improving their services to boost their traffic.

3. Customer Service Software: Salesforce

Customer relationship management is a crucial element in business, no matter the size or type of venture.

To that effect, Salesforce took this idea to the next level, providing its customers with industry-leading CRM services. Salesforce brings companies and consumers together, making it a favorite among reviewers.

This popular CRM software offers customizable customer relationship management systems that will positively impact your sales and help them grow. It gives your sales team real-time access to customer info in one place, with insightful reports and other functions on the dashboard.

4. Video Chat Software: Zoom

The pandemic favored Zoom, which became the de facto software for holding video conferences.

Despite social distancing requirements, most business owners took advantage of Zoom to continue conducting their business. Since then, Zoom's extensive support and feature set have made it a popular software choice worldwide.

Whether you use this software on your desktop or mobile, Zoom has something for you. Some of the features that make this app stand out are encryption, screen sharing, and live annotations, available regardless of whether you pay for it.

5. Document Management Software: Google Drive

Google Drive is a tiered storage service from Google that helps you store your files online and access them from the cloud.

Google Drive gives you the power to upload your content from anywhere through the cloud. More importantly, you get free web-based applications to create presentations, spreadsheets, and documents.

Today, Google Drive is one of the most popular cloud-based storage services. Files stored there can be accessed from any computer with an internet connection, doing away with the need for thumb drives.

What’s more, Google Drive also allows you to share your files, making it easy to work with a team.

6. Real Estate’s Insurance Platform: Honeycomb

Landlords can now get an insurance quote in under 5 minutes with this new all-digital insurance platform. Honeycomb is reinventing the way landlords buy insurance for their properties.

Up until now, landlords have frequently been overcharged and have had only a few options to choose from; coverage has been unreliable, and the process slow and cumbersome.

Honeycomb introduces the first 100% digital insurance platform for rental properties. It offers an all-digital solution with instant quotes, no third parties needed, and an AI-driven inspection process, allowing for a deeper, unbiased risk evaluation, all in real time.

Landlords no longer need to wait three weeks for a quote. Now they can get one in under five minutes.

Software development is a diverse industry, driven mainly by consumer behaviors and other dynamic factors.

For your business to flourish in the modern world, you’ll need to stay updated with the latest trends in the software world, including the latest products.

Best, Most Innovative Inflatable Pools for Summer (https://techreport.com/gadget-digest/best-most-innovative-inflatable-pools-for-summer/, published Thu, 21 Jul 2022)

It's summertime, and pool days are here. If you're looking to take a cool dip but don't want to break the bank installing an in-ground pool or deal with a semi-permanent above-ground pool, it may be time to invest in one of the inflatable pools you've been eyeing for the summer season.

Inflatable pools are a simple, cost-friendly solution. Much more affordable than an in-ground or above-ground pool, inflatable options are great for lounging with friends as the summer temperatures climb.

Here are five of the most innovative inflatable pools on the market right now. (Links and pricing are accurate as of the date of publication.)

What to Know Before Buying an Inflatable Pool

When looking for the right inflatable pool, you want to consider a few things. First, consider the size and ground area of your pool. Ask yourself: where will the pool go, is the surface smooth and flat, and will this surface support the weight of the pool when filled with water and people?

When making sure your pool surface is large enough, it's also a great time to check that you have space for draining your pool. Some pools have a plug that will drain the pool when you're ready. For others, however, you have to tilt the pool to pour the water out. Some of the pools listed below can hold hundreds of gallons of water at a time, so it's important to make sure you have the space for that much water to be spilled when you're done frolicking and relaxing. Keeping the water in the inflatable pool for a long period of time can lead to mosquitoes and other bugs.

Lastly, you'll want to buy a good air pump to accompany your pool purchase. Inflating an entire pool by yourself using just your lung capacity is exhausting and will take a very long time.

1. Bestway Fast Ground Pool Set

First on a lot of pool lists, Bestway pools come in multiple sizes, ranging from 8 feet by 26 inches up to 18 feet by 48 inches. This pool is very easy to set up: just set it on the surface of your choice and inflate the top ring. Once that is complete, you can fill the pool with water.

This above-ground pool can fit the whole family and should last a long time. Made of lightweight PVC material, this Bestway pool comes with a 330-gallon filter pump and cartridge to keep your pool looking fresh and clean all summer long. Once pool season is over, just activate the built-in flow-control drainage valve, which attaches to a standard garden hose.

You can find this pool in four sizes, ranging from $109 to $699.

2. Minnidip Pool: That’s Banana Leaves 

This Minnidip Pool is all over social media right now. It is much smaller than the Bestway pool above and is loved for its fun, summery print. This designer inflatable pool is pink and white and covered in banana leaves in shades of green.

Called the “adult kiddie pool,” the Minnidip is perfect for two or three people. This pool is 5 feet 5 inches wide and 18 inches tall. It’s easy to set this pool up using a standard air pump or a hair blow dryer.

You can find this pool for $69.99.

3. Bando Heart-Shaped Pool

Much like the Minnidip above, the Bando heart-shaped pool is incredibly cute and the perfect backdrop for pictures on your feed. The neon red pool is shaped like a heart and can fit two to three adults comfortably.

This pool can hold twelve gallons of water, has four air valves, and includes a drain for emptying the pool when you’re finished. It also comes with patch repairs. Bando also encourages people to try using this small, heart-shaped pool as a giant cooler at their next pool party. 

You can find this pool for $85.

4. Members Mark Elegant Family Pool

If you’re looking for a larger pool with built-in seats and backrests, look no further. The Members Mark Elegant Family Pool is ten feet long and has two air-cushioned backrests and seats. This pool also boasts a two-in-one valve to make inflation and deflation much easier. 

For those with lots of space, this is a great inflatable pool option. This pool is great for families and is made of puncture-resistant material. But just in case, the box includes a repair patch with your purchase. 

You can find this pool for $65.88.

5. Intex Swim Center Family Lounge Inflatable Pool

The Intex pool is a great choice for lounging with friends. This pool comes with four inflatable seats and backrests and even has drink holders built into the pool wall.

The Intex Swim Center Family Lounge Pool is 7.5×7.5 feet and has a wall height of 26 inches. This pool also includes a repair patch.

You can find this pool for $71.25.

Buying the Best Inflatable Pool

There you have it — five of the best inflatable pools on the market right now. And last but not least — don’t forget the sunscreen.

Top 7 Outdoor and Camping Projectors to Check Out This Year (https://techreport.com/review/top-7-outdoor-projectors-to-check-out-this-year/, published Wed, 20 Jul 2022)

Outdoor projectors are great for an outdoor event or camping with friends and family. However, finding the right projector can be a hassle. So, here is our list of outdoor projectors to make your next event spectacular.

1. DI Tong Bluetooth Outdoor Projector 

This projector is great for outdoor use for three reasons. First, its wide range of screen dimensions makes it very versatile, allowing you to have as big or small a screen as you need. Second, it can run cordless, with a battery that does not need to be plugged in at all times. Last, it really is handheld: super mobile yet durable at the same time.


Features of this projector include:

Brightness: 2,000 Lumens

Light Source: LED

Optical Resolution: 1280×720

Projected Dimension: 21-300 inches 

Screen Scale: 16:9

Buy on AwesomeProjectors.com for $507.00

2. AUN Portable HD Projector 

The AUN Portable HD projector is great for camping trips, an outdoor date night, or really anyone on the go. Indeed, you get all the quality of a big home cinema projector in a quarter of the size. This is a smart home projector as well, with limitless entertainment options in its library for any occasion. Plus, it comes with a remote, as well as a power cable with any adapter you might need for your specific country. You can purchase this projector and find more info about it at AwesomeProjectors.com.

Features of this projector include:

Brightness: 6,000 Lumens

Light Source: LED

Optical Resolution: 1920x1080p

Projected Dimension: 50-130 in

Screen Scale: 4:3/16:9

Buy on AwesomeProjectors.com 

3. DI Tong Outdoor/Camping Smart Full HD Projector 

Next on our list, the DI Tong Outdoor Smart projector comes in a little pricier but has all the gadgets. It has an 8-core processor and smart projector technology, plus Bluetooth, Full HD, autofocus, touch control, a large built-in battery to make it cordless, and 2.4/5 GHz Wi-Fi. With all of these features and a very durable design, it makes being in the outdoors even better.

Features of this projector include:

Light Source: LED

Optical Resolution: 1920×1080

Projected Dimensions: 20-300 in

Screen Scale: 16:9

Buy on AwesomeProjectors.com for $799.00

4. AUN Smart Mini Projector 

The AUN Smart Mini projector is a great bang for your buck. It has smart and Wi-Fi capabilities, and you can sync it to all of your devices. It also has a powerful internal battery to make it cordless and perfect for on the go. The brightness comes in at 2,000 lumens when using the battery but gets bumped up to 3,000 lumens when using the power adapter.

Features of this projector include:

Light Source: LED

Brightness: 3,000 Lumens

Optical Resolution: 800×480

Projected Dimensions: 20-120 inches

Screen Scale: 16:9

Buy on AwesomeProjectors.com for $190. 

5. BOMAKER Full HD Outdoor Wifi Portable Projector 

This projector wins in terms of usability. It easily connects to all types of devices, including iOS, Android, Mac, and Windows. It has no need for adapters or HDMI cables but does come with a bonus tripod. In addition, it can sync to your mobile device, so what you see on your phone is what you see on the screen with zero time delay. This projector is great for gaming or for large outdoor screens, and it's perfect for your family's outdoor movie night!

Features of this projector include:

Light Source: LED

Brightness: 6500 Lumens 

Optical Resolution: 1920x1080p

Projected Dimensions: 30-300 inches

Screen Scale: 4:3/16:9

Buy on AwesomeProjectors.com 

6. EUG Portable Outdoor/Home Theater Projector 

The EUG Portable Outdoor Projector brings a whole new meaning to portable. It literally fits in the palm of your hand, so you can bring it anywhere. With Wi-Fi capabilities, it can easily connect to any of your devices. It also has a very robust battery for its size, allowing you to use it for up to 3 hours on one charge. Additionally, the shape of this projector adds to its durability and makes it easy to store.

Features of this projector include:

Light Source: LED

Brightness: 6000 Lumens 

Optical Resolution: 1280×800

Projected Dimensions: 60-150 Inches

Screen Scale: 4:3/16:9

You can find this projector and more info at Awesome Projectors.

7. Thundeal Full HD Outdoor/Portable Projector 

The Thundeal Full HD Portable Projector is very solid all around, but what makes it stand out is the speaker quality. Its built-in speakers are the best on the list and work great both outside and inside. It easily connects to all of your devices except Android, which is what keeps it from the top of the list. If you do not have an Android, however, this should really be at the top of your projector considerations.

Features of this projector include:

Light Source: LED

Brightness: 3200 Lumens 

Optical Resolution: 1280×720

Projected Dimensions: 30-200 inches

Screen Scale: 4:3/16:9

You can learn more about and purchase this projector at Awesome Projectors.

Top 7 Online Wallets for Storing Crypto (https://techreport.com/review/online-wallets-for-storing-crypto/, published Tue, 12 Jul 2022)

It can sometimes be tough to keep up with all of the changes happening in the world. Between our own personal lives and the global crises that seem to make daily headlines, it's not always easy to stay on top of current events. Not to mention, it feels like new technologies are popping up on social media or in storefronts every day. One of these technologies is cryptocurrency. Crypto is one of the biggest investing and technological trends of the 21st century, and with it looking like a long-term fixture, the time to join the crypto game is quickly approaching. In this review, we will talk about some of the top online wallets for storing crypto.

What’s an Online Wallet?

An online wallet, or cryptocurrency wallet, is software that lets users securely store their crypto holdings online. As with a real-world wallet, crypto traders don't strictly need one; you can still spend or trade your online assets as you please. However, having an online wallet helps you both organize and safely store your cryptocurrency. While it is not a technical necessity, we recommend that anyone looking to invest in crypto also invest in an online wallet.
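
Under the hood, what any such wallet ultimately guards is a private key. The toy Python sketch below (using the third-party ecdsa package) shows the core idea; it is illustrative only, and real wallets add encrypted key storage, address encoding, and network code on top.

```python
# Toy illustration of what a crypto wallet manages (pip install ecdsa).
# Not production code: real wallets encrypt keys and derive addresses.
from ecdsa import SECP256k1, SigningKey

# The private key is the secret the wallet protects.
private_key = SigningKey.generate(curve=SECP256k1)  # the curve Bitcoin uses
public_key = private_key.get_verifying_key()        # shareable counterpart
print("public key:", public_key.to_string().hex())

# "Spending" coins means signing a transaction with the private key;
# anyone holding the public key can verify the signature.
message = b"send 0.1 BTC to address X"
signature = private_key.sign(message)
assert public_key.verify(signature, message)
```

The wallets below differ mainly in where that key lives and how each service secures it for you.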

Top 7 Online Wallets

Exodus

Our first entry, Exodus, is a great online wallet for crypto beginners. Exodus has an extremely user-friendly interface that keeps all your important information in one easy-to-navigate location. Users also get access to a wide range of cryptocurrency options and a built-in exchange. Exodus offers both a desktop and a mobile application, meaning you can access your online wallet from anywhere. For those looking to start in the crypto game, Exodus can be a great option.

Guarda

The next entry on our list, Guarda, is arguably the most secure online wallet available. Guarda has built its reputation on its security and its user-friendly interface. The service has quite expensive fees, but it is unquestionably one of the top online wallets. Users can expect seamless functionality and top-of-the-line security when trusting their crypto with Guarda.

Coinbase

Another great option to consider is Coinbase, especially for serious traders who are willing to upgrade to the Coinbase Pro service. The basic Coinbase is very easy to navigate and offers great security, but it also has somewhat expensive and confusing transaction fees that make it difficult for casual traders. Coinbase Pro users get access to detailed market history and statistics, as well as reduced trade fees. For those looking to get serious about cryptocurrency, Coinbase Pro may be the perfect online wallet for you.

Crypto.com

Crypto.com is a great, well-rounded option for anyone looking to join crypto. The fees are somewhat expensive, but the interface is extremely easy to navigate, and Crypto.com has one of the largest varieties of cryptocurrencies on its platform. Not to mention, Crypto.com has built an outstanding reputation for protecting its clients' investments and information. One of the top options all around, it is easy to see why millions are using Crypto.com.

Mycelium 

One of the top mobile crypto wallets comes from Mycelium. With over a decade in the crypto game, Mycelium is one of the most trustworthy online wallets available. The wallet comes only in the form of mobile apps, making it a great choice for Apple and Android users alike and allowing for easy trading on the go. Mycelium offers four different transaction fees that vary based on the size and type of transaction. All in all, Mycelium is a great mobile option for those looking for online wallets.

Electrum

The next entry on our list, Electrum, is one of the top online wallets for Bitcoin traders. Electrum is an open-source platform that has enabled users to trade Bitcoin for over a decade now. One of its biggest advantages is that users can make Bitcoin transactions over the Lightning Network, which allows for faster transactions and lower fees. Unfortunately, Electrum is not super user-friendly, and customer support is not one of the platform's strong suits. While maybe not for everyone, Electrum is a great option for savvy Bitcoin traders.

Blockchain

Last but not least comes the online wallet from Blockchain. With over 80 million crypto traders on the platform, Blockchain is easily one of the most popular online wallets today. With access to multiple crypto options, a secure site, and an enticing rewards plan, it is easy to see why so many people use Blockchain to store their crypto.

Coinbase.com Pros and Cons You Need to Know (https://techreport.com/review/coinbase-com-pros-and-cons-you-need-to-know/, published Mon, 11 Jul 2022)

If you're like most people, you probably hear about cryptocurrency all the time, whether it's in the news, on social media, or from that one friend who just won't stop talking about it. Crypto is everywhere. Even tech billionaire Elon Musk is in the cryptocurrency game. And while it may seem like crypto receives too much of the spotlight sometimes, the attention is warranted. Thanks largely to crypto, there are more millionaires than ever. And it's not only the tech moguls who are benefitting; regular people all around the world are having their lives change overnight thanks to some savvy crypto investing. If investing in crypto is something you are interested in, then you will inevitably come across Coinbase.com. In this review, we will cover some of the pros and cons of using Coinbase.com.

Pros

Extremely User Friendly 

One of the biggest benefits of using Coinbase.com is the website's extremely user-friendly software. Especially for people who are new to the crypto game, it can make trading much simpler. Both the website and the companion app have easy-to-navigate platforms that enable crypto trading with the click of a button.

Coinbase Pro

While the platform does offer trading without Pro, we highly recommend you check whether Coinbase Pro is right for you. With Coinbase Pro, you receive detailed statistics, market history, and portfolio guidance. This information helps you not only track your current investments but also better plan and strategize for the long term. For any serious crypto investor using the platform, Coinbase Pro is a must-have.

Massive Selection

One of the largest pros of using Coinbase is its massive selection of cryptocurrencies. Users can find nearly any cryptocurrency on the platform, from Dogecoin to Bitcoin, and can build a massive portfolio of diverse options.

Top of the Line Security

Utilizing Coinbase allows users to feel confident that their finances are secure. The platform has built its brand on its security, and with massive profits on the line, security is a top priority. With most of the assets on the site stored in various locations worldwide, users can rest easy when using the platform.

Cons

Customer Service

Unfortunately, given the size of the platform, customer service runs into issues at times with Coinbase.com. With millions of active daily users, support can be slow and inefficient. Some users have reported extremely long wait times on calls or other communications with the customer service team. While the site is very user-friendly, this can be a barrier for some new platform users. The issue isn't uncommon, so prepare for some tedious phone calls when you need customer service.

Expensive Fees

Unfortunately for casual users, the regular Coinbase platform comes with sometimes costly trading fees. Especially when conducting smaller transactions, these fees can add up quickly and really hurt user profits. All platform users have the option to upgrade to Coinbase Pro for cheaper fees, but for casual traders, that subscription fee can also be an annoying burden.

Confusing Fees

The transaction fees can be confusing as well as expensive. Coinbase oddly uses a structured fee system that charges proportionally more for smaller transactions. The fees also aren't consistently structured, so casual traders can lose significant profits to misunderstood fees. Coinbase Pro simplifies some of these transaction fees, but again, not everyone will want to upgrade.

Coinbase Pro

It's time to finish off our list of cons by addressing the elephant in the room: Coinbase Pro. While Coinbase Pro is a great addition for serious traders and investors, it adds a barrier for casual and new traders. Without the Coinbase Pro subscription, users receive a severely limited platform. While we enjoy Coinbase Pro, we recognize that it may not be a good option for everyone.

Final Review

Overall, we rank Coinbase.com an 8.3/10. Coinbase has a great website interface, a wide selection of cryptocurrencies, and an outstanding reputation for security. Coinbase Pro also brings a unique level of detail and insight to those who are serious about crypto trading; detailed statistics and history reports can help elevate your investing.

Unfortunately, Coinbase Pro also puts up a bit of an entry wall for new or casual traders. Without the subscription service, trade fees can be expensive and confusing. The site also runs into some issues and delays when dealing with customer service. All in all, though, we feel Coinbase is an extremely strong cryptocurrency platform, and we highly recommend you check it out for yourself.

The Rise of Reusable Notebooks: The Best Ones to Look For (https://techreport.com/gadget-digest/reusable-notebooks/, published Thu, 16 Jun 2022)

The world we live in today is dominated by new tech releases and technological innovations. While we're lucky to live in such a technologically advanced society, there are drawbacks. The biggest is the negative environmental impact our technology and production have on our surroundings. With this issue on the horizon, many brands and consumers alike are looking for more sustainable products and production methods. One product on the rise is the reusable notebook.

One way people are looking to reduce their environmental footprint is by cutting back on paper waste. You could write on both sides of the page, or you could try recycled, eco-friendly notebooks. In our opinion, however, the most compelling option is the reusable notebook.

Reusable notebooks allow you the feeling of writing and erasing on paper without all the excess paper waste. Here are some of the top options available for reusable notebooks.

Rocketbook Core Smart Notebook

A great affordable option, the Rocketbook Core Smart Notebook comes in at just $23. The reusable smart notebook comes in a variety of colors and can be found at your local Walmart. Using an erasable, refillable FriXion pen, users can write in the notebook and have the ink bond to the page. The pages are made of an innovative polyester compound that requires no trees and gives an authentic writing experience.

Users can then save their work and, using the Rocketbook app, transmit their notes to any cloud-based service like Google Drive or Dropbox. To get a blank page again, users just gently wipe the page clean. No matter what you need the notes for, the Rocketbook Core Smart Notebook is a truly great option.

Boogie Board Blackboard Letter Paperless Notepad

Our next product, the Boogie Board Blackboard Letter, is the perfect versatile option for young students. While this product is great for all ages, its sturdy, durable design is ideal for children who aren't always careful with their tech. No stylus or pen is required, meaning children can easily use their hands to draw and write.

Users can transfer their notes using the iOS- and Android-compatible app and can easily erase with a single button press. The reusable notebook also boasts a five-year eco-friendly battery. Because of its durability and simplicity, this product is the perfect notebook for younger children.

Homestec Reusable Smart Notebook

Our next option on the list is great for those on a budget. With a price point of just $14, the Homestec Reusable Smart Notebook is an extremely versatile and cost-effective option. The notebook comes in a 6.7 by 4.2-inch size and offers a choice between dotted or lined pages.

Each page can be used and erased up to 500 times, and the ink dries within 15 seconds. This means you can write quickly and freely without fear of overusing the notebook. Users can use either the pen-tip eraser or a blow dryer to clean off their pages.

Customers can also quickly download and transfer their notes to the cloud with the CamScanner app. Upon purchase, users will receive a usable pen and a set of sticky notes. With the low price and versatile features, this is one of the top options on the market.

Elfinbook 2.0 Smart Reusable Notebook

Next up on the list, the Elfinbook 2.0 is a nice throwback for those who grew up using a spiral notebook. The notebook features erasable, double-sided pages that users can wipe clean with a wet cloth. Users can also clear their pages using a microwave or hairdryer, as the ink clears with heat. Coming in two sizes, the notebook lets customers choose what works best for them.

The notebook works in companion with the Elfinbook app so that users can safely transfer and store their drawings and notes. While not the cheapest option on the market, the Elfinbook 2.0 is still relatively affordable at its current price. For those who love the nostalgia of the spiral notebook, this is the perfect reusable notebook option.

Rocketbook Everlast Mini

The final entry on our list of top reusable notebooks is the Rocketbook Everlast Mini. This is the perfect option for anyone looking for a compact and easy-to-carry reusable notebook. The mini has 48 reusable pages and can fit inside your pocket. Like the larger options from Rocketbook, the Everlast Mini comes in a variety of color options; it requires a FriXion pen for writing, and users can easily erase pages using a damp cloth.

Customers can again utilize the Rocketbook app to save and transfer their notes at the press of a button. For those looking for a pocket-size travel companion, the Rocketbook Everlast Mini is an outstanding option.

The Top 6 Smart Doorbells to Secure Your Home (https://techreport.com/review/smart-doorbells/, published Wed, 15 Jun 2022)

One of the largest trends of the 21st century is the shift to online shopping and e-commerce. In many ways, this shift has made our lives easier and our shopping more convenient. However, it also means that more strangers are coming by our homes. Thankfully, we have more options than ever for protecting our homes, and one such option is the smart doorbell.

It's not only online shopping: services like Uber and DoorDash also mean that more people than ever are coming to your living space. For the most part, we enjoy having access to instant services like these, but it does add a little wrinkle of worry about our safety at home.

Here are our top 6 smart doorbell options to help secure your home:

1. Ring Video Doorbell Pro

The first entry on our list is one of the top smart doorbells on the market today. The Ring Video Doorbell Pro is a wired device that comes with a vast array of features. Users have access to a dual-band Wi-Fi connection, and the camera can even provide color night video. The smart doorbell even offers automated Alexa greetings at the door.

Users should note that the Ring Video Doorbell Pro requires an existing doorbell system for power. Currently sitting at $139.99, it comes with free shipping and a free trial of the Ring Protect Plan for home security. All in all, this is one of the top-end smart doorbells on the market.

2. Wyze Video Doorbell Pro

Next on the list is the Wyze Video Doorbell Pro. Unlike the Ring Video Doorbell Pro, the Wyze Pro is a wireless smart doorbell system. The camera provides users with 2K video and a 150-degree field of view. The doorbell camera also comes with cloud storage services and includes a plug-in chime. The wireless battery has a six-month life, meaning you can set it up and leave it for quite a long time.

While the features are impressive, the best part about this Wyze system is its affordable price at under $100. For the cost, you won’t be able to find a better wireless option.

3. Ring Peephole

If you live in an apartment, don’t worry, we didn’t forget about you. The Ring Peephole is the perfect option for those who can’t remove their door from the hinges in order to install their smart camera system. Completely wireless, the Ring Peephole allows users all the benefits of a normal peephole and more. Ring Peephole allows users to not only look through the peephole but also to talk to visitors and pull up video feeds at any time.

Customers can purchase the Ring security system along with it to unlock additional features and services. Overall, the Ring Peephole doesn't offer the most extensive feature set, but it is a great option for those who live in an apartment.

4. Nest Doorbell

Next on our list comes the smart doorbell from Google, the Nest Doorbell. Coming in at around $180, this isn't the cheapest option on the market, and it lacks some of the high-end features of other smart doorbell systems. However, the biggest advantage of the Nest Doorbell is how easy it is to use. Because it is completely wireless, the Nest Doorbell can easily attach to any door in any home.

By purchasing a security subscription like Nest Aware, users can unlock some extra features that don’t come with the standard purchase. The easy installation and widespread compatibility make the Nest Doorbell perfect for those looking to try a smart doorbell for the first time.

5. Arlo Video Doorbell

The Arlo Video Doorbell is one of the best and most versatile options on the market. Customers have access to a wide array of features that include HD quality, two-way audio, motion detector technology, and even a built-in siren. Users can purchase a subscription service that enables cloud capability so that they can go back and review footage at any time. The wired system can even provide customized alerts for packages or guests.

All in all, the Arlo Video Doorbell is one of the best smart doorbell systems available and can provide you with extra peace of mind.

6. Ring Video Doorbell Pro 2

Last, but not least, the Ring Video Doorbell Pro 2 is arguably the most high-end doorbell camera on the market. It comes with a steep $260 price tag but has some of the best features available. The camera has a massive field of view and can record at up to 1536p, and video encryption technology ensures your recordings stay safe. It has just about every other feature you could want, too.

However, what truly makes the Pro 2 stand out is its bird's-eye-view mode, which grants the user the ability to track movement across the property. If cost is no object, this is an outstanding home security option.

The post The Top 6 Smart Doorbells to Secure Your Home appeared first on The Tech Report.

Full Review of AMD Radeon HD 7970 GHz Edition https://techreport.com/review/amds-radeon-hd-7970-ghz-edition/ Tue, 31 May 2022 08:41:00 +0000

Introduction

This article is a full review of the AMD Radeon HD 7970 GHz Edition. It includes photos, information, tables, graphs, comparisons, costs, game tests, and more.

The Review

In the great game of one-upmanship played by the two major graphics chip makers, one of the most prized goals is being first to market with a new generation of technology. AMD captured that waypoint late last year when it introduced the first 28-nm GPU, the Radeon HD 7970.

However, there are advantages to being later to market, because the competition has already played its hand. Nvidia smartly took advantage of that dynamic when it unveiled the GeForce GTX 680 several months ago. The new GeForce managed—just barely—to outperform the 7970, while consuming less power and bearing a price tag $50 lower than the Radeon’s. Nvidia couldn’t have pulled off that trifecta if not for the efficiency of its Kepler architecture, of course, but knowing the target surely helped in selecting clock speeds and pricing for the final product. The first reviews of the GTX 680 were uniformly positive, and the narrative was set: Kepler was a winner. Despite being second to market—or, heck, because of it—Nvidia had captured the mojo.

Then an interesting thing happened. Finding a GeForce GTX 680 card in stock at an online retailer became difficult—and the situation still hasn’t eased. Meanwhile, Radeon HD 7900-series cards appear to be plentiful. AMD’s spin on this situation is simply to point out that its cards are more readily available for purchase, which is undeniably true. Nvidia’s take is that it’s selling through GTX 680s as fast as it can get them—and that the problem is raging demand for its products, not just iffy supply. Since both companies rely on the same foundry (TSMC) for their chips, we suspect there’s some truth in Nvidia’s assertions. These things are hard to know for sure, but quite likely, the GTX 680 is outselling the 7970—perhaps by quite a bit.

If so, that’s just a tad insane, given how closely matched the two cards have been in our assessments. Evidently, capturing the mojo is very important indeed.

AMD’s answer to this dilemma is a new variant of the Radeon HD 7970 intended to reclaim the single-GPU performance crown, the awkwardly named Radeon HD 7970 GHz Edition. Compared to the original 7970, the GHz Edition has higher core (1GHz vs. 925MHz) and memory (1500MHz vs. 1375MHz) clock speeds, and it has a new “boost” feature similar to Kepler’s GPU Boost.

To understand the “boost” issue, we have to take a quick detour into dynamic voltage and frequency scaling (DVFS) schemes, such as the Turbo Boost feature in Intel’s desktop processors. AMD was the first GPU maker to introduce a DVFS scheme for graphics cards, known as PowerTune. PowerTune allows AMD to set higher stock GPU clock frequencies than would otherwise be possible within a given thermal envelope. The GPU then scales back clock speeds occasionally for workloads with unusually high demands, to enforce its power limits. Unlike the various Turbo and Boost schemes on other chips, though, PowerTune doesn’t raise clock speeds opportunistically in order to take advantage of any extra thermal headroom—at least, it hasn’t until now.

Like the Turbo Core feature in AMD’s FX processors, PowerTune works by monitoring digital activity counters distributed around the chip and using those inputs to estimate power consumption. These power estimates are based on profiles developed through extensive qualification testing of multiple chips. Somewhat uniquely, AMD claims the behavior of its DVFS schemes is deterministic—that is, each and every chip of the same model should perform the same. Intel and Nvidia don’t make such guarantees. If you get a sweetheart of a Core i5, it may outperform your neighbor’s; better cooling and lower ambient temperatures can affect performance, as well.

For the 7970 GHz Edition, AMD has refined its PowerTune algorithm to improve its accuracy. By eliminating some cases of overestimation, AMD claims, this revamped algorithm both increases the GPU’s clock speed headroom and allows the GPU to spend more time resident at its peak frequency. Furthermore, the 7970 GHz Edition adds an additional P-state that takes the GPU clock beyond its stock speed, to 1050MHz, when the thermal envelope permits. It ain’t much in the grand scheme, but this ability to reach for an additional 50MHz is the 7970 GHz Edition’s “boost” feature—and it is fairly comparable to the GPU Boost capability built into Nvidia’s Kepler.
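To make the mechanism concrete, here is a minimal sketch of how a PowerTune-style governor might choose a clock speed: estimate power from the chip's activity counters, then settle on the highest P-state that fits under the board's power limit. The P-states and the power model below are invented for illustration; AMD's actual algorithm and numbers are more sophisticated and not public.

```python
# A toy PowerTune-style governor (illustrative only -- the P-states and the
# power model below are invented, not AMD's real algorithm or numbers).

P_STATES_MHZ = [850, 925, 1000, 1050]   # last entry plays the role of the boost state
POWER_LIMIT_W = 250.0                   # the card's rated board power

def estimate_power(activity: float, clock_mhz: int) -> float:
    """Toy power model: fixed overhead plus a term that scales with
    clock speed and the activity level (0..1) read from the chip."""
    return 60.0 + 0.21 * clock_mhz * activity

def select_clock(activity: float) -> int:
    """Pick the fastest P-state whose estimated draw fits under the cap."""
    best = P_STATES_MHZ[0]              # the lowest state is always allowed
    for clock in P_STATES_MHZ:
        if estimate_power(activity, clock) <= POWER_LIMIT_W:
            best = clock
    return best

print(select_clock(0.70))   # moderate load -> 1050, the boost state
print(select_clock(0.95))   # power-virus load -> 925, throttled below stock
```

The deterministic flavor AMD describes falls out naturally from a scheme like this: because the inputs are activity estimates rather than temperature readings, two chips of the same model running the same workload should land on the same clocks.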

The higher default clock speeds and the PowerTune wizardry are the sum total of the changes to the GHz Edition compared to the original Radeon HD 7970. GHz Edition cards should still have the same ~250W max power rating, with six- and eight-pin aux power connectors. Above is a picture of our 7970 GHz Edition review unit, which came to us directly from AMD. However, there is a bit of a catch. The card above is based on AMD's reference design, but we understand retail cards from AMD's various partners will have custom coolers and possibly custom PCB designs. You won't likely see a 7970 GHz Edition that looks like that picture.

We’d like to show you a retail card, but those aren’t here yet. AMD tells us the first products should begin showing up at online retailers next week, with “wide availability” to follow the week after that.

| Card | Base clock (MHz) | Boost clock (MHz) | Peak ROP rate (Gpix/s) | Texture filtering int8/fp16 (Gtex/s) | Peak shader tflops | Memory transfer rate | Memory bandwidth (GB/s) | Price |
|---|---|---|---|---|---|---|---|---|
| XFX HD 7950 Black | 900 | | 29 | 101/50 | 3.2 | 5.5 GT/s | 264 | $409 |
| Radeon HD 7970 | 925 | | 30 | 118/59 | 3.8 | 5.5 GT/s | 264 | $449 |
| Radeon HD 7970 GHz | 1000 | 1050 | 34 | 134/62 | 4.3 | 6.0 GT/s | 288 | $499 |

Here’s a look at how the 7970 GHz Edition compares to a couple of Radeon HD 7900 cards already on the market. As you can see, the GHz Edition’s higher core and memory clock speeds separate it pretty clearly from the stock 7970 in key rates like pixel fill, texture filtering, shader flops, and memory bandwidth.

In fact, although it’s not listed in the table above, the 7970 GHz Edition is the first GPU to reach the 1 teraflop milestone for theoretical peak double-precision floating-point math throughput. Double-precision throughput is irrelevant for real-time graphics and probably mostly useless for consumer GPU-computing applications, as well. Still, this card hits a target recently mentioned by both Nvidia and Intel as goals for data-parallel computing products coming later this year.
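The arithmetic behind that milestone is easy to check, assuming Tahiti's 2048 shader ALUs, two flops per ALU per clock from fused multiply-adds, and the chip's quarter-rate double-precision support:

```python
# Back-of-the-envelope check on Tahiti's peak FP throughput at the GHz
# Edition's 1050MHz boost clock.
alus = 2048
boost_ghz = 1.050

sp_gflops = alus * 2 * boost_ghz    # ~4301 GFLOPS: the 4.3 tflops in the table
dp_gflops = sp_gflops / 4           # ~1075 GFLOPS: just past the 1-teraflop mark
print(f"SP: {sp_gflops:.0f} GFLOPS  DP: {dp_gflops:.0f} GFLOPS")
```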

AMD says the GHz Edition will list for $499.99, placing it directly opposite the GeForce GTX 680. We’ve taken the prices for the other two Radeons above from Newegg. Street prices for the Radeon HD 7970 have recently dropped to $449.99, 100 bucks below its introductory price, perhaps in part to make room for the GHz Edition.

We’ve included all three of these cards in this review because they illustrate the current state of the high-end Radeon lineup. Video card makers have more leeway than ever to offer higher-clocked variants of their products, and that means alert enthusiasts can snag some deals by ignoring branding and focusing on specs instead. For example, XFX’s “Black Edition” version of the Radeon HD 7950 is so aggressively clocked that it essentially matches the stock 7970 in pixel throughput rate and memory bandwidth. The XFX 7950 does give up a bit of texel fill rate and shader processing oomph to the stock 7970, but we probably wouldn’t pay the extra 40 bucks for the 7970, given everything.


XFX’s Radeon HD 7950 Black Edition challenges the stock 7970

Comparing to the Competition

The GeForce GTX 600-series lineup hasn’t been sitting still since its introduction, either. Nvidia has long given its partners wide latitude in setting clock speeds, and the resulting cards in this generation are much more attractive than the stock-clocked versions. We’ve lined up several of them to face off against the 7970 GHz Edition and friends, including a pair of ringers from Zotac.


Zotac’s GTX 680 AMP!

If the Radeon HD 7970 GHz Edition wants to own the title of the fastest single-GPU graphics card, it’ll have to go through Zotac’s GeForce GTX 680 AMP! Edition. At $549.99, the GTX 680 AMP! costs a bit more than the newest Radeon, but what’s 50 bucks in this rarefied air? You will also have to accept the potential clearance issues created by the heatpipes protruding from the top of Zotac’s custom cooler and the fact that this thing eats up three expansion slots in your PC. In return, the GTX 680 AMP! is a pretty substantial upgrade over the stock GTX 680.


Yep, this is a different card: Zotac’s GTX 670 AMP!

Believe it or not, Zotac’s GeForce GTX 670 AMP! is also an upgrade over the stock GeForce GTX 680. Yes, the GK104 graphics processor in the GTX 670 has had a mini-lobotomy—one of its eight SMX units disabled—but Zotac more than makes up for it with aggressive core and memory clocks. Have a look at the numbers.

| Card | Base clock (MHz) | Boost clock (MHz) | Peak ROP rate (Gpix/s) | Texture filtering int8/fp16 (Gtex/s) | Peak shader tflops | Memory transfer rate | Memory bandwidth (GB/s) | Price |
|---|---|---|---|---|---|---|---|---|
| Zotac GTX 670 AMP! | 1098 | 1176 | 38 | 132/132 | 3.2 | 6.6 GT/s | 211 | $449 |
| GeForce GTX 680 | 1006 | 1058 | 34 | 135/135 | 3.3 | 6 GT/s | 192 | $499 |
| Zotac GTX 680 AMP! | 1111 | 1176 | 38 | 151/151 | 3.6 | 6.6 GT/s | 211 | $549 |

Assuming the GPU typically operates at its Boost clock speed—and that seems to be a solid assumption to make with GK104 cards—then Zotac’s GTX 670 AMP! nearly matches the stock GTX 680 in texture filtering and shader flops. Since the GTX 670 silicon isn’t hobbled at all in terms of memory interface width or ROP count, the AMP! matches its bigger brother in terms of pixel fill rate (which corresponds to multisampled antialiasing power) and memory throughput, surpassing the stock GTX 680. On paper, at least, I’d expect the GTX 670 AMP! to outperform a stock GTX 680, since memory bandwidth may be the GK104’s most notable performance constraint.

Along those lines, notice that the fastest cards above have “only” 211 GB/s of memory throughput, while the 7970 GHz Edition is rated for 288 GB/s. That’s a consequence of the fact that Nvidia’s GK104 is punching above its weight class. This middleweight Kepler only has a 256-bit memory interface. The Tahiti chip driving the Radeon HD 7900-series cards is larger and sports a 384-bit memory interface. By all rights, AMD ought to be able to win this contest outright. The fact that folks are buying up GTX 680 cards for 500 bucks or more is vaguely amazing, given the class of hardware involved. But, as we’ll see, the performance is there to justify the prices.
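Those bandwidth figures fall straight out of the transfer rate and bus width, as this quick recomputation shows:

```python
# Memory bandwidth is just transfer rate times bus width, divided by
# eight bits per byte.
def bandwidth_gb_s(transfer_gt_s: float, bus_width_bits: int) -> float:
    return transfer_gt_s * bus_width_bits / 8

print(bandwidth_gb_s(6.6, 256))   # GK104 on the Zotac AMP! cards: 211.2 GB/s
print(bandwidth_gb_s(6.0, 384))   # Tahiti on the 7970 GHz Edition: 288.0 GB/s
```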

Game Testing the AMD Radeon HD 7970 GHz Edition

Test notes

Before we dive into the test results, I should mention a couple of things. You will notice on the following pages that we tested games at very high resolutions and quality levels in order to stress these graphics cards appropriately for the sake of our performance evaluation. We think that’s appropriate given the task at hand, but we should remind you that a good PC gaming experience doesn’t require a $450+ video card. We’ve hand-picked games that are especially graphically intensive for our testing. Not every game is like that.


Diablo III doesn’t need extreme GPU horsepower

For instance, we wanted to include Diablo III in our test suite, but we found that, on this class of graphics card, it runs at a constant 100 FPS with its very highest image quality settings at our monitor’s peak 2560×1600 resolution. Diablo III is a pretty good looking game, too, but it’s just no challenge.

Along the same lines, we have tested practically everything at a resolution of 2560×1600. We realize that 1080p displays are the current standard for most folks and that they’re much more widely used than four-megapixel monsters. Here’s the thing, though: if you’re going to fork over the cash for a $500 video card, you’ll want a high-res display to pair with it. Maybe one of those amazingly priced Korean 27″ monitors. Otherwise, the video card will probably be overkill. In fact, as we were selecting the settings to use for game testing, the question we asked more often was whether we shouldn’t be considering a six-megapixel array of three monitors in order to properly stress these cards. Also, in the odd case where we did think 1920×1080 might be appropriate, we found that the beta Nvidia drivers we were using didn’t expose that resolution as an option in most games.

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and we’ve reported the median result.

Our test systems were configured like so:

| Component | Specification |
|---|---|
| Processor | Core i7-3820 |
| Motherboard | Gigabyte X79-UD3 |
| Chipset | Intel X79 Express |
| Memory size | 16GB (4 DIMMs) |
| Memory type | Corsair Vengeance CMZ16GX3M4X1600C9 DDR3 SDRAM at 1600MHz |
| Memory timings | 9-9-11-24 1T |
| Chipset drivers | INF update 9.3.0.1019, Rapid Storage Technology Enterprise 3.0.0.3020 |
| Audio | Integrated X79/ALC898 with Realtek 6.0.1.6526 drivers |
| Hard drive | Corsair F240 240GB SATA |
| Power supply | Corsair AX850 |
| OS | Windows 7 Ultimate x64 Edition, Service Pack 1, DirectX 11 June 2010 Update |

| Card | Driver revision | GPU base core clock (MHz) | Memory clock (MHz) | Memory size (MB) |
|---|---|---|---|---|
| Zotac GeForce GTX 570 | GeForce 304.48 beta | 732 | 950 | 1280 |
| Zotac GTX 670 AMP! | GeForce 304.48 beta | 1098 | 1652 | 2048 |
| GeForce GTX 680 | GeForce 304.48 beta | 1006 | 1502 | 2048 |
| Zotac GTX 680 AMP! | GeForce 304.48 beta | 1111 | 1652 | 2048 |
| Radeon HD 6970 | Catalyst 12.7 beta | 890 | 1375 | 2048 |
| XFX Radeon HD 7950 Black | Catalyst 12.7 beta | 900 | 1375 | 3072 |
| Radeon HD 7970 | Catalyst 12.7 beta | 925 | 1375 | 3072 |
| Radeon HD 7970 GHz Edition | Catalyst 12.7 beta | 1000 | 1500 | 3072 |

Thanks to Intel, Corsair, and Gigabyte for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following test applications:

Some further notes on our methods:

  • We used the Fraps utility to record frame rates while playing either a 60- or 90-second sequence from the game. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We tested each Fraps sequence five times per video card in order to counteract any variability. We’ve included frame-by-frame results from Fraps for each game, and in those plots, you’re seeing the results from a single, representative pass through the test sequence.
  • We measured total system power consumption at the wall socket using a Yokogawa WT210 digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench. The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Arkham City with DirectX 11 at 2560×1600 with FXAA enabled.
  • We measured noise levels on our test system, sitting on an open test bench, using an Extech 407738 digital sound level meter. The meter was mounted on a tripod approximately 10″ from the test system at a height even with the top of the video card. You can think of these noise level measurements much like our system power consumption tests, because the entire system's noise levels were measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card's highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.
  • We used GPU-Z to log GPU temperatures during our load testing.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Battlefield 3

We tested Battlefield 3 with all of its DX11 goodness cranked up, including the “Ultra” quality settings with both 4X MSAA and the high-quality version of the post-process FXAA. Our test was conducted in the “Kaffarov” level, for 60 seconds starting at the first checkpoint.

Let me apologize in advance for what follows, because it is a bit of a data dump. We're about to do some unusual things with our test results, and we think it's best to show our work first. The plots below come from one of the five test runs we conducted for each card.

| Frame time (ms) | FPS rate |
|---|---|
| 8.3 | 120 |
| 16.7 | 60 |
| 20 | 50 |
| 25 | 40 |
| 33.3 | 30 |
| 50 | 20 |

Yep, those plots show the time required to produce each and every frame of the test run. Because we’re reporting frame times in milliseconds, lower numbers are better. If you’re unfamiliar with our strange new testing methods, let me refer you to my article Inside the second: A new look at game benchmarking for an introduction to what we’re doing.

The short version is that we’ve decided traditional FPS averages aren’t a very good indicator of the fluidity of animation in real-time graphics. The problem isn’t with reporting things as a rate, really. The problem is that nearly every utility averages frame production rates over the course of a full second—and a second is a very long time. For example, a single frame that takes nearly half a second to render could be surrounded by frames that took approximately 16 milliseconds to render—and the average reported over that full second would be 35 FPS, which sounds reasonably good. However, that half-second wait would be very disruptive to the person attempting to play the game.

In order to better understand how well a real-time graphics system works, we need to look closer, to use a higher-resolution timer, if you will. Also, rather than worrying about simple averages, we can consider the more consequential question of frame latencies. What we want is consistent production of frames at low latencies, and there are better ways to quantify that sort of thing. Since we’re going to be talking about frame times in milliseconds, I’ve included a handy table on the right that offers conversions from milliseconds to FPS.
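To illustrate why the two summaries can disagree, here's a small sketch (not our actual tooling) that computes both statistics for a contrived one-second trace containing a single long hitch:

```python
import numpy as np

def summarize(frame_times_ms: np.ndarray) -> None:
    """Contrast the traditional FPS average with the 99th percentile frame time."""
    avg_fps = 1000.0 / frame_times_ms.mean()
    p99_ms = np.percentile(frame_times_ms, 99)   # 99% of frames finish faster than this
    print(f"FPS average: {avg_fps:.1f}   99th percentile: {p99_ms:.1f} ms")

# One second of animation: 59 smooth ~60Hz frames plus a single 450 ms hitch.
# The average still looks respectable; the percentile gives the stutter away.
run = np.array([16.7] * 59 + [450.0])
summarize(run)   # FPS average: 41.8   99th percentile: ~194 ms
```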

We’ll start by reporting the traditional FPS average. As you can see, the Radeon HD 7970 GHz Edition just outperforms the stock GeForce GTX 680 in this metric, although the difference is too small to worry about, really.

If we’re thinking in terms of frame latencies, another way to summarize performance is to look at the 99th percentile frame time. That simply means we’re reporting the threshold below which 99% of all frames were rendered by each card. We’ve ruled out that last 1% of outliers, and the resulting number should be a decent indicator of overall frame latency. As you can see, the differences between the top few cards are even smaller by this measure.

A funny thing happens, though, to our two legacy cards, the GeForce GTX 570 and the Radeon HD 6970. Although the GTX 570 has a slightly higher FPS average, its 99th percentile frame time is much higher than the 6970’s. Why? Well, the 99th percentile is just one point on a curve, so we shouldn’t make too much of it without putting it into context. We can plot the tail end of the latency curve for each card to get a broader picture.

The GeForce GTX 570 is faster than the Radeon HD 6970 most of the time, until we get to the last 3% or so of the frames being produced. Then the GTX 570 stumbles, as frame times shoot toward the 100-millisecond mark. Scroll up to the frame time plots above, and you can see the problem. The GTX 570’s plot is spiky, with a number of long-latency frames interspersed throughout the test run. This is a familiar problem with older Nvidia GPUs in BF3, though it appears to have been resolved in the GK104-based cards.

In fact, all of the newer cards are nearly ideal performers, with nice, straight lines in the high 20- and low 30-millisecond range. They only curve up modestly when we reach the last one or two percentage points.

You’ll recall that our 99th percentile frame time measurement ruled out the last 1% of long-latency frames. That’s useful to do, but since this is a real-time application, we don’t want to ignore those long-latency frames entirely. In fact, we want to get a sense of how bad it really is for each card. To do so, we’ve concocted another measurement that looks at the amount of time spent working on frames for which we’ve already waited 50 milliseconds. We’ve chosen 50 ms as a threshold because it corresponds to 20 FPS, and somewhere around that mark, the illusion of motion seems to be threatened for most people. Also, 50 ms corresponds to three full vertical refresh intervals on a 60Hz display. If you’re gaming with vsync enabled, any time spent beyond 50 ms is time spent at 15 FPS or less, given vsync’s quantization effect.

Predictably, only the two legacy cards spend any real time beyond 50 ms, and only the GeForce GTX 570 really has a problem. The GTX 570 has a real but not devastating issue here; it spends 1.4 seconds out of our 60-second test session working on long-latency frames. Our play-testing sessions on this card felt sluggish and clumsy. Remember, though: when we started, the GTX 570 had a higher FPS average than the Radeon HD 6970. That FPS number just turns out to be pretty meaningless.

Max Payne 3

Max Payne 3 is a new addition to our test suite, and we should note a couple of things about it. As you’ll notice in the settings image above, we tested with FXAA enabled and multisampling disabled. That’s not the most intensive possible setting for this game, and as you’ll soon see, Max 3 runs quite quickly on all of the cards we’ve tested. We wanted to test with MSAA, but it turns out multisampling simply doesn’t work well in this game. Quite a few edges are left jagged. Even the trick of combining MSAA with FXAA doesn’t seem to work here. Enabling both disables FXAA, somehow. We couldn’t see the point of stressing the GPUs arbitrarily while lowering image quality, so we simply tested with the highest quality setting, which in this case was FXAA.

Also, please note that this test session wasn't exactly repeatable like most of our others. We had to shoot and dodge differently each time through, so there was some natural variation from one run to the next, although we kept to the same basic area and path.

All of these plots look really good. Remember, they only come from a single test run with some inherent variation, so those spikes on a few cards aren’t necessarily a major problem.

Can you feel the parity? The 7970 GHz just edges out the stock GeForce GTX 680 in the FPS sweeps, only to tie it in the 99th percentile frame time results. Let’s see if the larger latency picture tells us anything new.

Nope. This time, there’s very little drama involved. The reality is that, on all of the newer cards, the vast majority of the frames are produced in under 16.7 milliseconds—or over 60 FPS, if you will. That means, on a 60Hz monitor, all of the newer cards have a frame ready at almost every single display refresh interval.

Our typical measure of “badness” breaks down here, since none of the cards spend any time working on frames over 50 milliseconds. In fact, ratcheting things down to 33.3 milliseconds (the equivalent of 30 FPS) produces a big goose egg, too. Only when we take the next step down to 16.7 ms do we have any separation between the cards, and then it’s only the older ones that show us anything.

DiRT Showdown

We’ve added the latest entry in the DiRT series to our test suite at the suggestion of AMD, who has been working with Codemasters for years on optimizations for Eyefinity and DirectX 11. Although Showdown is based on the same game engine as its predecessors, it adds an advanced lighting path that uses DirectCompute to allow fully dynamic lighting. In addition, the game has an optional global illumination feature that approximates the diffusion of light off of surfaces in the scene. We enabled both the new lighting path and global illumination in our tests.

This is a fantastic game, by the way. My pulse was pounding at the end of each 90-second test run.

Well, I suppose this is what happens sometimes when a GPU maker works closely with a game developer to implement some new features. Showdown simply runs better on Radeons than on GeForces, and it’s not even close. We’ve seen lots of similar scenarios in the past where Nvidia took the initiative and reaped the benefits. Perhaps this is karmic payback.

The GeForces are just overmatched here. You’d want to dial back the image quality settings or lower the resolution to play Showdown on any of the GeForces, including the Zotac GTX 680 AMP! card. The GTX 570 is nearly unplayable, although I did my best to muddle through the testing.

The Elder Scrolls V: Skyrim

Our test run for Skyrim was a lap around the town of Whiterun, starting up high at the castle entrance, descending down the stairs into the main part of town, and then doing a figure-eight around the main drag.

We set the game to its “Ultra” presets with 4X multisampled antialiasing. We then layered on FXAA post-process anti-aliasing, as well, for the best possible image quality without editing an .ini file. We also had the high-res texture pack installed, of course.

Even at the game’s highest image quality settings, our Skyrim scenario doesn’t challenge any of the newer cards much. The parity at the 99th percentile frame time is pretty remarkable and suggests a CPU limitation may be coming into play in the toughest last few percent of frames.

The only card that struggles at all here is the GeForce GTX 570, and we suspect that it’s bumping up against some VRAM size limitations; it has the smallest video memory capacity of the bunch.

Batman: Arkham City

We did a little Batman-style free running through the rooftops of Gotham for this one.

We’re used to seeing those latency spikes in our Arkham City test sequence. We’re moving through a very large and detailed cityscape, and the game engine has to stream in new areas as we glide into them.

Remember what I said about karmic payback? Here’s an excellent game that happens to run better on GeForce cards—and Nvidia worked with developer Rocksteady Studios on it.

The Radeons’ 99th percentile frame times are relatively high given their FPS averages; even the 7970 GHz Edition falls behind the GeForce GTX 570. Why?

The Radeon’s latency curves shoot upward for the last 5-7% of frames. You can spot the problem in the plots up above. Although the plots for the GeForces show quite a few latency spikes, the spikes are more frequent on the Radeons. That extra helping of long frame times puts the Radeons at a disadvantage.

Our measure of “badness” captures the scope of the problem. The 7900-series Radeons spend two to three times as long working on especially high-latency frames as the GeForces do. Interestingly, though, the 7970 GHz Edition avoids these slowdowns much more effectively than the stock 7970 and XFX 7950 do. Perhaps the new, more aggressive PowerTune algorithm is paying off here.

Crysis 2

Our cavalcade of punishing but pretty DirectX 11 games continues with Crysis 2, which we patched with both the DX11 and high-res texture updates.

Notice that we left object image quality at “extreme” rather than “ultra,” in order to avoid the insane over-tessellation of flat surfaces that somehow found its way into the DX11 patch. We tested 60 seconds of gameplay in the level pictured above, where we gunned down several bad guys, making our way across a skywalk to another rooftop.

The 7970 GHz Edition takes first place in the FPS average results, but it’s in third when it comes to the more latency-focused 99th percentile frame time. You spotted the reason in the plots, right? There are periodic spikes on all of the Radeons, spikes that are missing from the GeForce plots.

A broader look at the latency picture reveals that it’s just a sliver of the overall frames, the last 1% or so, that cause trouble for the Radeons. The GeForces are exemplary, by contrast.

There’s not much reason to fret about the occasional high frame times on the Radeons, though. Even the slowest card of the bunch spends less than a fifth of a second on long-latency frames throughout the 60-second test run. That nicely backs up our subjective impression that most of the cards handled this test scenario reasonably well.

Power consumption

We like to test power draw under load by running a real game rather than a synthetic worst-case, power-hog application. This time around, we chose Arkham City to generate that load. Turns out that game induces higher power draw than Skyrim, which we’ve used in the past, or Max Payne 3.

Forgive me for leaving out the two older cards here. Time limits prevented us from testing them for power and noise.

For the unfamiliar, the Radeon HD 7900 series has the ability to drop into a special low-power state, called ZeroCore Power, when the display goes into power-save mode. In this state, the GPU’s power draw drops to just a few watts, and the video card’s cooling fans stop spinning. That’s why the Radeons are so much more efficient in the first set of results above. Otherwise, at idle, the GPUs are more or less at parity.

Interesting. These results are pretty different from what we saw when we used Skyrim to generate the load. We really didn't expect to see the stock 7970 drawing less power than the GeForce GTX 680. We may have to use multiple games next time around, if time permits.

Regardless, the 7970 GHz Edition draws quite a bit more power than the stock 7970.

Noise levels and GPU temperatures

ZeroCore power confers a slight advantage on the Radeons in the noise department when the display is off. Otherwise, the stock coolers from AMD and Nvidia are pretty evenly matched, and the custom coolers from XFX and Zotac are a bit louder at idle.

The big winners here are the two Zotac cards, whose triple-slot cooler manages to maintain by far the lowest temperatures and the lowest noise levels of the bunch. Our 7970 GHz Edition review unit is pretty loud. The saving grace is that, as we’ve noted, you’re not likely to see this exact card on store shelves. The third-party coolers from AMD’s various partners will hopefully be quieter, although they will have quite a bit of heat to dissipate, given the 7970 GHz Edition’s additional power draw.

Final Conclusions and Closing Thoughts

So, has AMD gotten the mojo back? Let’s boil things down to one of our famous value scatter plots to see. As always, we’ve sourced prices from Newegg and Amazon, and the performance results are averaged across all of the games we tested. We’re relying on our 99th percentile frame time metric for our performance summation, but we’ve converted the result to FPS to keep our scatter plot readable. As always, the better values will be positioned closer to the top left corner of the plot.

The Radeon HD 7970 GHz Edition has indeed recaptured the single-GPU performance title for AMD; it’s even faster than Zotac’s GTX 680 AMP! Edition. And at $499.99, the 7970 GHz Edition is unambiguously a better value than the stock-clocked GeForce GTX 680. Everything seems to be going AMD’s way—even our power consumption results turned out to be closer than expected. I’d say that qualifies for mojo reclamation status, but I suppose the market will decide about that one. We’re curious to see whether GTX 680 cards will continue to be scarce once the 7970 GHz Edition lands on store shelves.

Costs

For those of us who are willing to accept something a little less than the top-of-the-line offering, there are some other nice values on our scatter plot, including the notable duo of the Zotac GTX 670 AMP! and the original Radeon HD 7970. Those two cards cost the same and perform very similarly, so they overlap almost completely on our scatter plot. Either of those cards will cost you 50 bucks less than the 7970 GHz Edition, with only a slight drop in overall performance. Given how well all of the newer cards handled our test scenarios, we’d say sensible folks who are shopping in this price range might want to save a few bucks and snag one of those.

With that said, we suspect the story of the 7970 GHz Edition hasn’t been completely told just yet. AMD’s partners haven’t delivered their customized cards, and as we’ve noted, our review unit is more of a reference design than an actual product. We expect to see higher boost clocks and potentially superior custom coolers once the actual cards arrive. Meanwhile, AMD has supplied us with a slew of potentially interesting new material for testing alongside the 7970 GHz Edition, including some GPU computing-focused applications, a couple more games with fancy new effects, and (at long last!) a version of ArcSoft Media Converter capable of using the Tahiti chip’s built-in H.264 video encoding hardware. Comically, we had only four working days to prepare this review, and the new material arrived in our inbox this past Monday evening. I suppose a follow-up article may be in order.

For now, we at least have a fresh reminder of how close the battle for GPU supremacy is in this generation of chips. You can’t go wrong with either team this time around, although the mojo is liable to change hands again at any moment.

I’m not nearly as wordy on Twitter.

The post Full Review of AMD Radeon HD 7970 GHz Edition appeared first on The Tech Report.

AMD's Radeon HD 6990 Graphics Card https://techreport.com/review/amds-radeon-hd-6990-graphics-card/ Mon, 30 May 2022 19:00:00 +0000

This article is a review of the AMD Radeon HD 6990 graphics card, and it provides information, graphs, data, game testing, comparisons, and more.

Introduction

Dual-GPU graphics cards have always been kind of strange. Technically, one of them is probably the fastest video card on the planet at any given time. For over a year now, for instance, the Radeon HD 5970 has held that title. Yet dual-GPU cards don’t garner loads of attention, for a host of reasons: they’re subject to the same performance and compatibility pitfalls as any multi-GPU configuration, they tend to have rather extreme power and cooling needs, and they’re usually really freaking expensive, to name a few.

Nevertheless, these odd creations are here to stay. Much of the credit for that fact goes to AMD. The company has been carefully refining its dual-GPU products over the past few years, ever since it decided to stop producing as-big-as-you-can-make-them GPUs. Instead, recent flagship Radeons have been based on pairs of mid-sized chips.

The Oddity of the 5970

The Radeon HD 5970 was odd in that it wasn't the absolute pinnacle of extremeness that one would expect out of this class of product. The card's sheer length and price tag were both plenty extreme, but its 725MHz default clock speed gave it performance closer to a pair of Radeon HD 5850s than to a pair of higher-end 5870s. The limiting factor there was power draw. AMD had to tune the card conservatively to ensure that it didn't exceed 300W—its rated power draw and the max capacity provided by its 6- and 8-pin auxiliary power connectors—even in absolute peak cases. To skirt this limitation somewhat, AMD practically encouraged 5970 owners to venture into 400W territory by overclocking their cards, even going so far as to screen the chips to ensure they would reach clock speeds similar to the Radeon HD 5870's. It was innovation, of a sort, born of necessity.

Into Antilles

Now the 5970’s successor has arrived in the form of a product code-named Antilles: the Radeon HD 6990, an all-new card based on a pair of the “Cayman” GPUs that power the Radeon HD 6970. Cayman is something of an incremental improvement over the Cypress chip that powered the Radeon HD 5970, so one might expect the 6990 to be an incremental step up from the 5970, as well. That’s not quite the case, for a couple of reasons.

First, AMD has endowed the 6990 with a pair of 8-pin auxiliary power connectors and raised the card's max power rating to 375W. That gives the card quite a bit more headroom. Second, and more critically, AMD built into Cayman a power-capping feature known as PowerTune, which allows the GPU to monitor its own power draw and ramp back clock speeds if needed to stay within its prescribed power envelope. Although PowerTune doesn't often limit performance dramatically in typical gaming workloads, we've found that it will kick in when synthetic tests push the GPU past its normal bounds. That ability to prevent problems in worst-case scenarios has freed AMD to push for higher default clock speeds without fear of creating problems.

As a result of these and other changes, AMD has set the Radeon HD 6990’s clock speed at 830MHz while leaving all of Cayman’s execution units enabled. Each GPU on the card also has 2GB of GDDR5 memory clocked at 1250MHz, for an effective transfer rate of 5 GT/s. Those numbers put the 6990’s theoretical peak performance right in between what one would expect from a couple of Radeon HD 6950s and a couple of Radeon HD 6970s—not too shabby, to say the least.
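Those memory figures check out with a little arithmetic, given GDDR5's four transfers per clock and the 256-bit memory interface on each Cayman GPU:

```python
# GDDR5 moves four transfers per memory clock, and each Cayman GPU has its
# own 256-bit memory interface.
mem_clock_mhz = 1250
transfer_gt_s = mem_clock_mhz * 4 / 1000    # 5.0 GT/s effective
per_gpu_gb_s = transfer_gt_s * 256 / 8      # 160 GB/s per GPU
print(2 * per_gpu_gb_s)                     # 320 GB/s across the pair
```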

Not Satisfied, What’s more?

AMD apparently wasn’t satisfied with that achievement, though. As you may know, all Radeon HD 6900-series cards have a dual-position switch on the top of the card near the CrossFire connector, ostensibly to enable one to switch to a recovery firmware in the event of a failed video BIOS flash attempt. On the 6990, however, moving that switch from its default position (2) to the other one (1) enables access to a hopped-up BIOS. AMD calls it the “Antilles Unlocking Switch for Uber Mode” or—yes, this is happening—AUSUM. Several things happen when your 6990 cards goes into the umlaut-impaired uber mode. The base GPU clock rises to 880MHz, same as the 6970, and the core GPU voltage rises from 1.12V to 1.175V. Also, the board’s PowerTune limit is raised to 450W. You’re essentially overclocking your card when you switch it into uber mode; AMD doesn’t guarantee proper operation for everyone in every system. However, going AUSUM worked just fine with our 6990 sample on our Intel X58-based test system, much like the 5970 did for us at its “suggested” overclocked speed.

If that’s not enough AUSUM-ness for you, AMD has given 6990 users more than enough leeway to get into real trouble. The Overdrive controls in the AMD control panel will allow GPU overclocks as high as 1200MHz, with memory overclocking as high as 1500MHz (or 6 GT/s).

| Card | Peak pixel fill rate (Gpixels/s) | Peak bilinear integer texel filtering rate (Gtexels/s) | Peak bilinear FP16 texel filtering rate (Gtexels/s) | Peak shader arithmetic (GFLOPS) | Peak rasterization rate (Mtris/s) | Peak memory bandwidth (GB/s) |
|---|---|---|---|---|---|---|
| GeForce GTX 560 Ti | 26.3 | 52.6 | 52.6 | 1263 | 1644 | 128 |
| GeForce GTX 570 | 29.3 | 43.9 | 43.9 | 1405 | 2928 | 152 |
| GeForce GTX 580 | 37.1 | 49.4 | 49.4 | 1581 | 3088 | 192 |
| Radeon HD 6850 | 24.8 | 37.2 | 18.6 | 1488 | 775 | 128 |
| Radeon HD 6870 | 28.8 | 50.4 | 25.2 | 2016 | 900 | 134 |
| Radeon HD 6950 | 25.6 | 70.4 | 35.2 | 2253 | 1600 | 160 |
| Radeon HD 6970 | 28.2 | 84.5 | 42.2 | 2703 | 1760 | 176 |
| Radeon HD 5970 | 46.4 | 116.0 | 58.0 | 4640 | 1450 | 256 |
| Radeon HD 6990 | 53.1 | 159.4 | 79.7 | 5100 | 3320 | 320 |
| Radeon HD 6990 AUSUM | 56.3 | 169.0 | 84.5 | 5407 | 3520 | 320 |

With or without the AUSUM switch enabled, the 6990’s specifications are downright staggering. On paper, at least, it’s far and away the fastest consumer graphics card ever. Of course, we’re just adding up the capacities of its two individual GPUs in the table above and assuming the best—perfect scaling—will happen. That’s not always how things work out in the real world, of course, but the 6990 has more than enough extra oomph to overcome less-than-ideal outcomes.
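Here's that optimistic adding-up in code form, assuming Cayman's 1536 ALUs and two flops per ALU per clock:

```python
# The "perfect scaling" arithmetic behind the 6990's shader figures:
# two Cayman GPUs, 1536 ALUs each, two flops per ALU per clock.
def dual_cayman_gflops(clock_ghz: float, alus_per_gpu: int = 1536) -> float:
    return 2 * alus_per_gpu * 2 * clock_ghz

print(dual_cayman_gflops(0.830))   # ~5100 GFLOPS for the stock 6990
print(dual_cayman_gflops(0.880))   # ~5407 GFLOPS with the AUSUM switch flipped
```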

What the Graphics Cards Look Like, Including AMD's Radeon HD 6990


The Radeon HD 6970 (top) versus the 6990 (bottom)

The Radeon HD 5970 (top) versus the 6990 (bottom)

Yep, the Radeon HD 6990 is long—just a sliver shy of a full 12″, in fact, inviting all sorts of remarks that are surely beneath us. You will want to check the clearance in your case carefully before ordering up one of these puppies. Even the ridiculously lengthy 5970 is a tad shorter.


Source: AMD.

Beneath the 6990’s massive cooling shroud lies a brand-new board design that, interestingly, places a single blower in the center of the card, above the voltage regulators and the PCIe bridge chip that acts as an interconnect between the two GPUs and the rest of the system. Those VRMs, incidentally, are digital programmable units from Volterra that are unique to Antilles. AMD says they enable lower temperatures and lower power draw than conventional VRMs.


Source: AMD.

The blower is flanked by a pair of heatsinks with copper vapor chambers at their bases. AMD claims that, although this card fits into roughly the same form factor as the 5970, it moves 20% more air with this arrangement. In addition, the firm tells us the thermal interface material between the heatsinks and the GPUs is a special, phase-change variety that offers 8% better performance than the standard gray goo. Take note: if you disassemble your card, you'll likely have to use regular thermal paste when reassembling it, sacrificing some of its cooling efficiency. We've avoided taking ours apart, so far, because we want our power, noise, and temperature readings to track with what you'd see from retail products.

An array of compact Mini-DisplayPort connectors allows the 6990 to sport a rich mix of display outputs while leaving room for a full slot cover of exhaust venting. The 6990, obviously, can drive up to five displays natively. Since it supports DisplayPort 1.2, it can even drive multiple displays simultaneously off of a single output with the help of a DisplayPort hub.

AMD clearly leads the industry on the display output front. The only drawback is the need for adapters to support “legacy” displays with HDMI or DVI inputs. Fortunately, every Radeon HD 6990 will ship with a trio of adapter dongles to convert those Mini-DP ports to serve other standards: one passive HDMI type, one passive single-link DVI type, and one active single-link DVI type. Out of the box, the 6990 should be able to drive a trio of single-link DVI monitors, then. The reason that third adapter is of the “active” variety is that the GPU has a limited number of timing sources for its display outputs. If you’d like to drive more than three “legacy” displays with a 6990, you’ll need additional active adapters. Similarly, driving a second or third dual-link DVI display, such as a 30″ panel, will require additional active, dual-link-capable dongles.

All of this to-do about multiple displays is, of course, only an issue because AMD has been pushing its Eyefinity multi-monitor gaming feature so enthusiastically in the past year and a half—and because the 6990 looks like the perfect device for driving large numbers of megapixels. Given the 6990's five-way output array, AMD has pointed out how naturally this card would support an interesting display configuration: a five-display-wide wall of monitors in portrait orientation. That sounds like a whole lotta bezel area to me, but it's certainly a bevy o' pixels.

The Costs

Before we move on to our test results, where you can see exactly how the 6990 performs, there are just a couple more details to which we should attend. Although the 6990 is being unveiled today, you likely won't see it selling at online retailers until some time later this week or perhaps early next. When it does arrive, if you'd like to make one your very own, you need only hand over something close to its list price to your favorite online retailer. That price? $699.99.

Gulp.

That’s a lot, but given that the Radeon HD 6970 is selling for about 340 bucks a pop, this single card that has essentially two of ’em onboard isn’t priced at any great premium, believe it or not.

Testing

Our Testing Methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and we’ve reported the median result.

Our test systems were configured like so:

| Component | Specification |
|---|---|
| Processor | Core i7-980X |
| Motherboard | Gigabyte EX58-UD5 |
| North bridge | X58 IOH |
| South bridge | ICH10R |
| Memory size | 12GB (6 DIMMs) |
| Memory type | Corsair Dominator CMD12GX3M6A1600C8 DDR3 SDRAM at 1600MHz |
| Memory timings | 8-8-8-24 2T |
| Chipset drivers | INF update 9.1.1.1025, Rapid Storage Technology 9.6.0.1014 |
| Audio | Integrated ICH10R/ALC889A with Realtek R2.58 drivers |
| Graphics | Radeon HD 5970 2GB with Catalyst 11.4 preview drivers |
| | Dual Radeon HD 6950 2GB with Catalyst 11.4 preview drivers |
| | Radeon HD 6970 2GB with Catalyst 11.4 preview drivers |
| | Dual Radeon HD 6970 2GB with Catalyst 11.4 preview drivers |
| | Radeon HD 6990 4GB with Catalyst 11.4 preview drivers |
| | MSI GeForce GTX 560 Ti Twin Frozr II 1GB + Asus GeForce GTX 560 Ti DirectCU II TOP 1GB with ForceWare 267.26 beta drivers |
| | Zotac GeForce GTX 570 1280MB with ForceWare 267.24 beta drivers |
| | Zotac GeForce GTX 570 1280MB + GeForce GTX 570 1280MB with ForceWare 267.24 beta drivers |
| | Zotac GeForce GTX 580 1536MB with ForceWare 267.24 beta drivers |
| | Zotac GeForce GTX 580 1536MB + Asus GeForce GTX 580 1536MB with ForceWare 267.24 beta drivers |
| Hard drive | WD RE3 WD1002FBYS 1TB SATA |
| Power supply | PC Power & Cooling Silencer 750 Watt |
| OS | Windows 7 Ultimate x64 Edition, Service Pack 1 |

Thanks to Intel, Corsair, Western Digital, Gigabyte, and PC Power & Cooling for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following test applications:

Some further notes on our methods:

  • Many of our performance tests are scripted and repeatable, but for some of the games, including Battlefield: Bad Company 2 and Bulletstorm, we used the Fraps utility to record frame rates while playing a 60- or 90-second sequence from the game. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We raised our sample size, testing each Fraps sequence five times per video card, in order to counteract any variability. We’ve included second-by-second frame rate results from Fraps for those games, and in that case, you’re seeing the results from a single, representative pass through the test sequence.
  • We measured total system power consumption at the wall socket using a Yokogawa WT210 digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench. The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Battlefield: Bad Company 2 at a 2560×1600 resolution with 4X AA and 16X anisotropic filtering. We test power with BC2 because we think it's a solidly representative peak gaming workload.
  • We measured noise levels on our test system, sitting on an open test bench, using an Extech 407738 digital sound level meter. The meter was mounted on a tripod approximately 10″ from the test system at a height even with the top of the video card. You can think of these noise level measurements much like our system power consumption tests, because the entire system's noise levels were measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card's highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.
  • We used GPU-Z to log GPU temperatures during our load testing.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Bulletstorm Test

This game is based on the aging Unreal Engine, but it’s stressful enough on a GPU to still make a decent candidate for testing. We turned up all of the game’s image quality settings to their peaks and enabled 8X antialiasing, and then we tested in 90-second gameplay chunks.

Our single-GPU configs all struggled with this game, as did our pair of GeForce GTX 560 Ti cards in SLI. Those 560s had the least memory of any cards we tested, at 1GB each. Multi-GPU schemes like SLI and CrossFireX have some memory overhead, and we expect that’s what troubled our 560s in this case.

The 6990, meanwhile, goes toe to toe with a thousand-dollar option from Nvidia: a pair of GeForce GTX 580s in SLI. The Nvidia alternative in the same price range as the 6990 would be a pair of GTX 570s, but those are a bit slower. Then again, a couple of 6950s in CrossFireX perform very similarly to the 6990, and flipping the AUSUM switch doesn’t get you much here, either.

F1 2010 Test

F1 2010 steps in and replaces CodeMasters' previous effort, DiRT 2, as our racing game of choice. F1 2010 uses DirectX 11 to enhance image quality in a few select ways. A higher quality FP16 render target improves the game's high-dynamic-range lighting in DX11. A DX11 pixel shader is used to produce soft shadow edges, and a DX11 Compute Shader is used for higher-quality Gaussian blurs in HDR bloom, lens flares, and the like.

We used this game’s built-in benchmarking facility to script tests at multiple resolutions, always using the “Ultra” quality preset and 8X multisampled antialiasing.

The Radeons had a strong showing in the last game, but this is unexpected dominance from AMD. At the highest resolution where the GPU is the primary bottleneck, dual Radeon HD 6950s outrun a couple of GeForce GTX 580s. The 6990 is faster still, and the AUSUM switch nearly moves the 6990 into dual 6970 territory.

Civilization V Test

Civ V has a bunch of interesting built-in tests. Up first is its compute shader benchmark. This test measures the GPU's ability to decompress textures used for the graphically detailed leader characters depicted in the game. The decompression routine is based on a DirectX 11 compute shader. The benchmark reports individual results for a long list of leaders; we've averaged those scores to give you the results you see below.

Obviously, the green team takes this one. Not every compute shader is the same, but this one runs better on Nvidia’s Fermi architecture than on Cayman. Regardless of the GPU type, though, one thing holds steady: the performance gains from adding a second GPU are real but modest. That’s why the 6990 is in an unusual position, near the bottom of the pack.

In addition to the compute shader test, Civ V has several other built-in benchmarks, including two we think are useful for testing video cards. One of them concentrates on the world leaders presented in the game, which is interesting because the game’s developers have spent quite a bit of effort on generating very high quality images in those scenes, complete with some rather convincing material shaders to accent the hair, clothes, and skin of the characters. This benchmark isn’t necessarily representative of Civ V‘s core gameplay, but it does measure performance in one of the most graphically striking parts of the game. As with the earlier compute shader test, we chose to average the results from the individual leaders.

The 6990 comes out looking pretty good here, but why is the 5970 so much faster? My guess is that this pixel-shader-intensive test is causing the Cayman GPUs to heat up and invoke their PowerTune limits. Without PowerTune, the 5970 is slower in most real gaming scenarios, but it’s quicker here.

Another benchmark in Civ V focuses, rightly, on the most taxing part of the core gameplay, when you’re deep into a map and have hundreds of units and structures populating the space. This is when an underpowered GPU can slow down and cause the game to run poorly. This test outputs a generic score that can be a little hard to interpret, so we’ve converted the results into frames per second to make them more readable.

My, how the tables have turned! The GeForces take three of the top four spots. Why? I have another crackpot theory for you. There’s tremendous geometric complexity in this late-game scene, with a huge number of units in view at once. Nvidia’s Fermi architecture has some real advantages in geometry processing throughput, and I suspect they’re making themselves known here.

StarCraft II Test

Up next is a little game you may have heard of called StarCraft II. We tested SC2 by playing back 33 minutes of a recent two-player match using the game’s replay feature while capturing frame rates with Fraps. Thanks to the relatively long time window involved, we decided not to repeat this test multiple times. The frame rate averages in our bar graphs come from the entire span of time. In order to keep them readable, we’ve focused our frame-by-frame graphs on a shorter window, later in the game.

We tested at the settings shown above, with the notable exception that we also enabled 4X antialiasing via these cards’ respective driver control panels. SC2 doesn’t support AA natively, but we think this class of card can produce playable frame rates with AA enabled—and the game looks better that way.

The demo we used for testing here is newer than any we’ve used before, and Blizzard has made a number of changes to SC2 over time. As a result, this test turns out to be more taxing than some of our past attempts. The GeForces end up looking very good here, not just in terms of higher average frame rates but also in terms of avoiding the lowest valleys. The frame rate minimums for the Radeons, even the AUSUM 6990, are in the teens.

Battlefield: Bad Company 2 Test

BC2 uses DirectX 11, but according to this interview, DX11 is mainly used to speed up soft shadow filtering. The DirectX 10 rendering path produces the same images.

We turned up nearly all of the image quality settings in the game. Our test sessions took place in the first 60 seconds of the “Heart of Darkness” level.

We’ve been see-sawing back and forth between clear wins for AMD and Nvidia, but this game looks like an even match. Two GTX 570s in SLI perform about the same as a 6990 or a pair of 6970s in CrossFireX. Notice, also, the excellent ~40 FPS minimums produced by a single 6970 or GTX 570. Even those single-GPU cards handle Bad Company 2 pretty darn well.

Metro 2033 Test

We decided to test Metro 2033 at multiple image quality levels rather than multiple resolutions, because there’s quite a bit of opportunity to burden these graphics cards simply using this game’s more complex shader effects. We used three different quality presets built into the game’s benchmark utility, with the performance-destroying advanced depth-of-field shader disabled and tessellation enabled in each case.

We’ve seen the same trend in this game for quite a while. As the image quality rises, the Radeons become more competitive. At Metro‘s “Medium” settings, two GTX 570s in SLI are easily faster than the 6990. By the time we reach the “Very high” settings, the opposite is true.

Aliens vs. Predator Test

AvP uses several DirectX 11 features to improve image quality and performance, including tessellation, advanced shadow sampling, and DX11-enhanced multisampled anti-aliasing. Naturally, we were pleased when the game’s developers put together an easily scriptable benchmark tool. This benchmark cycles through a range of scenes in the game, including one spot where a horde of tessellated aliens comes crawling down the floor, ceiling, and walls of a corridor.

For these tests, we turned up all of the image quality options to the max, along with 4X antialiasing and 16X anisotropic filtering.

Wow, not much drama there as the resolution changes. The Radeons are looking relatively strong here again, with the 6990 out ahead of dual GTX 570s.

Power consumption

AMD initially suggested to us that the 6990’s idle power draw should be somewhat lower than the 5970’s, and so it is. That’s not a huge difference, but it is something. Heck, the entire system based on the 6990 only draws 8W more at idle than the same system equipped with a single GeForce GTX 580.

Under load, the 6990 remains reasonable, drawing less power than a pair of GTX 560s in SLI, though it generally outperforms them. There is a price for being AUSUM, though, and apparently it’s about 50 watts. Who knew? Still, the AUSUM 6990 config draws substantially less power than our GeForce GTX 570 SLI system.

Noise levels and GPU temperatures

The 6990 is the loudest solution we tested, both at idle and, more dramatically, when running a game. That difference is especially perceptible when the card is hitting over 58 dB on our decibel meter. You will notice the difference between the 6990 and the other solutions; it’s quite audible. The 6990 emits a fairly loud hiss, although its pitch and tenor aren’t especially offensive compared to some of the worst solutions we’ve seen over the years.

Dual-card setups have an acoustic advantage, as our results illustrate. With four slots occupied and two full-length coolers, there’s simply more surface area available for heat dissipation. With that said, AMD has apparently tuned the 6990’s cooler fairly aggressively; it has some of the lowest GPU temperatures of the bunch, and you pay for those low temperatures with a little extra noise.

Conclusions
With a total of just seven games tested, we can ruthlessly boil down the Radeon HD 6990 and its competition to a simple price-performance scatter plot, like so:

We’ve taken the results from the highest resolution or most intensive setting of each game tested, averaged them, and combined them with the lowest prevailing price at Newegg for each of these configurations. Doing so gives us a nice distribution of price-performance mixes, with the best tending toward the upper left and the worst toward the bottom right.
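If you’re curious how we build these, the sketch below shows the general recipe in Python. The card names are real, but the prices and average-FPS figures in it are placeholders for illustration, not our measured results.

```python
# Minimal sketch of how a price-performance scatter plot comes together.
# The price and FPS numbers below are placeholders, not our test data.
import matplotlib.pyplot as plt

cards = {
    # name: (lowest prevailing price in USD, average FPS across the tests)
    "Radeon HD 6990": (700, 80),
    "GeForce GTX 580 SLI": (1000, 95),
    "Radeon HD 6970 CrossFireX": (680, 82),
}

for name, (price, fps) in cards.items():
    plt.scatter(price, fps)
    plt.annotate(name, (price, fps), textcoords="offset points", xytext=(5, 5))

plt.xlabel("Price (USD)")
plt.ylabel("Average FPS")
plt.title("Price-performance scatter")
plt.show()
```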

Final Closing Thoughts

At present, in the suite of games we tested, AMD looks to have a performance advantage at several key price points. That may be a little jarring if your expectations were set several months ago, when we had something close to parity between red and green. We believe AMD has come by it honestly, delivering some impressive performance gains in recent driver releases. One of those changes, AMD tells us, is a revised resolve mechanism for multisampled antialiasing that improves frame rates generally when MSAA is in use—like in nearly all of our test scenarios—particularly in games that use deferred shading schemes. AMD’s driver developers have made some notable progress in CrossFireX multi-GPU performance scaling, too. Multi-GPU scaling has long been one of the hallmarks of Nvidia’s SLI, but AMD has closed that gap in recent months.

Of course, both of these changes benefit the Radeon HD 6990, which has no equal in a single-card package. This is the planet’s fastest single video card, supplanting the Radeon HD 5970 that came before it. The 6990 is even faster than two GeForce GTX 570 cards in SLI, which cost about the same amount, and the 6990 draws less power under load, even in AUSUM uber mode. Add in the 6990’s rich array of display outputs, and there’s no question Nvidia is severely outclassed at this lofty $700 price point. We just hope the 6990 isn’t quite as difficult to find over the next year as the Radeon HD 5970 was during much of its run. We do believe TSMC’s 40-nm supply problems are behind us, so we’re optimistic on that front.

Having said that, we can’t help but notice that AMD does offer a more attractive solution in terms of price, performance, and acoustics in the form of dual Radeon HD 6970 cards. You must really covet slot space—or have designs for a dual-6990, four-way CrossFireX rig—if you pick the 6990 over two 6970s. Not that there’s anything wrong with that.

We also can’t avoid noticing Nvidia still owns the title of the fastest dual-GPU solution on the market, in the form of two GeForce GTX 580s in SLI. And we have some clear indications that Nvidia may be cooking up an answer to the Radeon HD 6990 based on that same technology. The challenge Nvidia faces if it wants to dethrone the 6990 is, of course, power draw and the related cooling required. Given that two GTX 570s are slower than a single 6990 and draw more power, the GeForce team certainly has its work cut out for it. Besting the 6990 will have to involve some deep magic, or at least solid progress on multiple fronts.

Or, you know, a triple-slot cooler.

The post AMD’s Radeon HD 6990 Graphics Card appeared first on The Tech Report.

]]>
Full Review of AMD Radeon R9 380X Graphics Card https://techreport.com/review/amds-radeon-r9-380x-graphics-card-reviewed/ Mon, 30 May 2022 17:00:00 +0000 http://localhost/wordpress/amds-radeon-r9-380x-graphics-card-reviewed amd radeon r9 380x

Introduction This article is a review of the AMD Radeon R9 380X graphics card. It includes tables, graphs, pictures, information, and game testing. This post is sponsored by DVwarehouse where...

The post Full Review of AMD Radeon R9 380X Graphics Card appeared first on The Tech Report.

]]>
amd radeon r9 380x

Introduction

This article is a review of the AMD Radeon R9 380X graphics card. It includes tables, graphs, pictures, information, and game testing.

This post is sponsored by DVwarehouse where you can buy refurbished/used computer products for fantastic prices. See the great prices for refurbished apple desktop computers on their website. Buying DVWarehouse computer products helps support us at Techreport!

The Full Review

If you’re shopping for a graphics card in the $200 to $250 range these days, your choice mostly boils down to one question: 2GB or 4GB? Nvidia’s GeForce GTX 960 comes with 2GB of RAM to start, and fancier versions come with four gigs. AMD’s similarly priced Radeon R9 380 performs comparably and can also be had in 2GB and 4GB flavors. Simple enough.

AMD is shaking up that comfortable parallel today with the Radeon R9 380X. This card’s Tonga GPU has more resources enabled than in the familiar Radeon R9 380. On the 380X, all of Tonga’s 32 GCN compute units are turned on, for a total of 2048 shader processors. This card also packs 128 texels per clock of texture-filtering power, versus 112 in the plain 380. The Radeon R9 380X will come with 4GB of GDDR5 RAM clocked at 1425MHz for a theoretical bandwidth peak of 182 GB/s.
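That 182 GB/s figure falls straight out of the memory specs. Here’s the arithmetic, worked in a few lines of Python:

```python
# GDDR5 moves four bits per pin per clock, so a 1425MHz memory clock
# yields an effective transfer rate of 5.7 GT/s. Across a 256-bit bus:
memory_clock_mhz = 1425
transfer_rate_gts = memory_clock_mhz * 4 / 1000   # 5.7 GT/s
bus_width_bits = 256
bandwidth_gbs = transfer_rate_gts * bus_width_bits / 8
print(bandwidth_gbs)  # 182.4 GB/s, matching the quoted peak
```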


Sapphire’s Nitro Radeon R9 380X

Aside from those slightly more generous resource allocations, the R9 380X’s spec sheet looks much the same as the R9 380. This card maintains its counterpart’s 32-pixel-per-clock ROP throughput and 256-bit memory bus. Since Tonga is one of AMD’s newer GPUs, it also gives the R9 380X support for modern AMD features like FreeSync, TrueAudio, Virtual Super Resolution, and Frame Rate Target Control.

We’ve long suspected that a fully enabled Tonga would have a 384-bit memory interface, along with more ROP throughput (48 pixels per clock). In fact, several sources appear to have confirmed that fact. However, the 380X has “only” a 256-bit path to memory. We’re not complaining, though. The 380X’s price and likely performance look to be quite attractive, even if they’re not exactly what we’d expected. Tonga’s color compression capability ought to help wring the best possible performance out of the card’s available memory bandwidth.

Here’s a quick look at the R9 380X’s specs, bracketed by those of the Radeon R9 380 and R9 390 for easy comparison:

            Clock  ROP pixels/  Texels filtered/  Stream      Memory path  GDDR5 transfer  Memory   Peak power  Price
            (MHz)  clock        clock             processors  (bits)       rate            size     draw        (street)
R9 380      918    32           112               1792        256          5.5 GT/s        2GB/4GB  190W        $210
R9 380X     970    32           128               2048        256          5.7 GT/s        4GB      190W        $239
R9 390      1000   64           160               2560        512          6 GT/s          8GB      230W        $319

Those are AMD’s reference specs, and as you can see, the 380X offers a little more juice than the R9 380 across the board.

What’s more, AMD’s board partners have already worked over the Radeon R9 380X with custom coolers and boosted clock speeds. Cards from those partners are the ones most builders will be using in their systems, and those cards are also the ones we’ll be using to test the Radeon R9 380X today.

 

Additional Arrivals

 

We’ve already shown you Sapphire’s Nitro Radeon R9 380X above. This card comes with an eyebrow-raising 1040MHz GPU clock speed out of the box. The company also gooses the card’s memory clock to 1500MHz, for an effective speed of 6 GT/s. Sapphire’s card arrived in our labs first, so it’s the one we’ll use to represent the 380X’s performance in most of our tests.

Sapphire keeps the 380X’s Tonga chip cool with one of its attractive Dual-X heatsinks. This cooler’s twin ball-bearing fans can stop spinning under light loads for silent operation. From this angle, you can see this card’s numerous copper heat pipes, too.

It might be unusual to note for a graphics card, but the Nitro feels hefty and dense in the hand. That weightiness, and the copper on display, suggest a top-shelf cooler under the Nitro’s shroud. At about 9″ long, this card should be able to fit into most cases without a fuss, too.

Sapphire reinforces the Nitro 380X’s PCB with an attractively-finished aluminum backplate. The card draws power through twin six-pin PCIe power connectors. Sapphire tells us this Nitro 380X will carry a suggested price of $239.99. The company will also sell a reference-clocked model for $229.99.

Asus is also getting in on the R9 380X game, and it sent us one of its Strix R9 380X OC4G Gaming cards to put through the wringer. The Strix comes with a 1030MHz clock speed by default, and a setting in Asus’ included software utility can push the clocks all the way to 1050MHz.

This card’s brawny DirectCU II cooler carries heat away from the GPU with massive heat pipes that snake through an equally substantial fin array. Like the Sapphire card, the Strix can stop its fans at idle for silent running. Builders will want to double-check that their cases can swallow this card’s 10.5″ length without issue, though.

From the top down, we get a better look at this card’s attractive backplate, that enormous heat pipe, and the twin six-pin power connectors. You can’t see it in this picture, but Asus helpfully includes an LED near the power connectors that will glow red if you forget to plug in the required cables. The company also throws a one-year premium subscription for Xsplit Gamecaster in the box for the streamers out there.

The hot-clocked OC4G Gaming card seen above will carry a $259.99 suggested price. Asus will also offer a reference-clocked Strix R9 380X with the same cooler for $239.99.

No, you’re not experiencing deja vu. We’ve also included an Asus Strix Radeon R9 380 card in this review. This card will represent the slightly-less-powerful R9 380 on our bench today. For the unfamiliar, the R9 380 is essentially just a re-badged Radeon R9 285—only this one has 4GB of memory, versus the 2GB on most R9 285 cards.

From the outside, this card looks a lot like the Strix 380X. It’s got a lot of the same perks from its more muscular sibling, like the semi-silent cooler and the Xsplit subscription. This version of the 380 sells for $219.99 on Newegg right now, and Asus is offering a $20 rebate card to sweeten the deal.

 

The GeForce GTX 960 goes 4GB, too
Going by price, the most natural foil for the Radeon R9 380X in Nvidia’s lineup is the GeForce GTX 960. We already know and love Gigabyte’s Windforce GTX 960 2GB from when we first reviewed that GPU, but card makers are now offering versions of the GTX 960 with 4GB of GDDR5 that are closer to the 380X’s sticker price. It seemed only logical to pick up the 4GB version of this card to represent the green team this time around.

This card goes for about $230 right now on Newegg. Larger memory size aside, the Windforce is practically identical to its 2GB cousin. This card gives us higher-than-reference 1216MHz base and 1279MHz boost clocks, and it keeps the hot-clocked GPU cool with a whisper-quiet twin-fan heatsink.

Game Testing the AMD Radeon R9 380X

Our testing methods

Most of the numbers you’ll see on the following pages were captured with Fraps, a software tool that can record the rendering time for each frame of animation. We sometimes use a tool called FCAT to capture exactly when each frame was delivered to the display, but that’s usually not necessary in order to get good data with single-GPU setups. We have, however, filtered our Fraps results using a three-frame moving average. This filter should account for the effect of the three-frame submission queue in Direct3D. If you see a frame time spike in our results, it’s likely a delay that would affect when the frame reaches the display.
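For the curious, here’s a minimal Python sketch of that filter, assuming a plain list of per-frame render times in milliseconds pulled from a Fraps log. It’s an illustration of the idea, not our exact tooling.

```python
# A minimal sketch of a three-frame moving-average filter over frame times.
def moving_average(frame_times, window=3):
    filtered = []
    for i in range(len(frame_times)):
        # Average the current frame with up to (window - 1) preceding frames.
        chunk = frame_times[max(0, i - window + 1) : i + 1]
        filtered.append(sum(chunk) / len(chunk))
    return filtered

print(moving_average([16.7, 16.7, 50.0, 16.7]))
# The 50-ms spike is spread over its neighbors rather than hidden entirely.
```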

We didn’t use Fraps with Ashes of the Singularity, Battlefield 4, or the Fable: Legends benchmark. Instead, we captured frame times directly from the game engines using the games’ built-in tools. We didn’t use our low-pass filter on those results.

As ever, we did our best to deliver clean benchmark numbers. Our test systems were configured like so:

Processor Core i7-5960X
Motherboard Gigabyte X99-UD5 WiFi
Chipset Intel X99
Memory size 16GB (4 DIMMs)
Memory type Corsair Vengeance LPX
DDR4 SDRAM at 2133 MT/s
Memory timings 15-15-15-36 1T
Hard drive Kingston SSDNow 310 960GB SATA
Power supply Corsair AX850
OS Windows 10 Pro

Here are the full specs of the cards we used in this review, along with their driver versions:

                                 Driver revision   GPU base core  GPU boost    Memory clock  Memory size
                                                   clock (MHz)    clock (MHz)  (MHz)         (MB)
MSI GeForce GTX 960 Gaming 2G    GeForce 358.91    1216           1279         1753          2048
Asus Strix Radeon R9 380 4GB     Catalyst 15.11.1  990            N/A          1425          4096
Gigabyte GeForce GTX 960 4GB     GeForce 358.91    1216           1279         1753          4096
Sapphire Radeon R9 380X          Catalyst 15.11.1  1040           N/A          1500          4096
MSI GeForce GTX 970 Gaming 4G    GeForce 358.91    1114           1253         1753          4096
XFX Radeon R9 390                Catalyst 15.11.1  1015           N/A          1500          8192

Thanks to Intel, Corsair, Kingston, and Gigabyte for helping to outfit our test rigs with some of the finest hardware available. Our thanks to Sapphire, Asus, XFX, and MSI for providing the graphics cards we tested in this review, as well.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

 

Fallout 4

Let’s kick off our testing with a brief stint in Bethesda’s newest post-apocalyptic romp. Fallout 4 uses several of the latest rendering techniques to provide players with the prettiest Wasteland yet.


Looking over this plot of the frame times from a single test run, we can see that all of the cards here are generally delivering a smooth experience, with no large or frequent spikes toward the 50-ms range. Big spikes like that would correspond to a frame-rate drop below 20 FPS, which can translate to a noticeable slowdown during gameplay.

The slightly more copious serving of pixel-pushing resources in the R9 380X helps it edge out the GeForce GTX 960s and the Radeon R9 380 when it comes to average FPS numbers. The Radeon R9 390 and GeForce GTX 970 are in a whole different league, though. We’d expect nothing less of those considerably more muscular cards in this measure of potential performance.

As the plots above hinted, the R9 380X is off to a good start in our advanced 99th-percentile frame time measure.  Neither the GeForce GTX 960 cards nor the Radeon R9 380 can deliver frames quite as smoothly as the 380X. To be fair, all of the cards here are hanging pretty close to the 33.3-ms threshold, so they all render the majority of their frames at a rate of 30 FPS or better. Meanwhile, the Radeon R9 390 offers a smoother experience still, while the GeForce GTX 970 rules the roost by a wide margin.
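In case you’re new to this metric, here’s roughly how a 99th-percentile frame time falls out of the raw data; this is a simplified Python sketch rather than our exact tooling.

```python
# Sort the frame times and read off the value that 99% of frames come in under.
def percentile_frame_time(frame_times_ms, pct=99):
    ordered = sorted(frame_times_ms)
    index = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return ordered[index]

# A 33.3-ms result means 99% of frames were delivered at 30 FPS or better.
```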

Looking at the “tail” of the frame time distribution for Fallout 4 lets us see how these cards handle the toughest frames they’re tasked with. The R9 380X has a curious hump in the middle of its curve as we move toward the 99th percentile. Even so, its worst frames don’t fall much above the 30-ms mark, so gamers can expect decently smooth frame delivery from the card overall.

Our numbers thus far have already told this tale, but the R9 380 and the GeForce GTX 960s don’t manage quite as low a frame time curve as the 380X here.


In our measures of “badness,” none of the cards spend any time past the 50-ms threshold that tends to produce noticeable drops in smoothness. The 380X doesn’t spend any time beyond the 33.3-ms barrier, either. The R9 380 and the GTX 960s do have a few bad frames past the 33.3-ms mark that might drop the frame rate under 30 FPS briefly, though.
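Here’s a simplified Python sketch of one way to compute that “badness” figure, counting only the portion of each frame time past the threshold; our actual tooling differs in the details.

```python
# Total time spent working on frames that took longer than a threshold,
# counting only the portion of each frame time past the threshold.
def time_spent_beyond(frame_times_ms, threshold_ms=33.3):
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

print(time_spent_beyond([16.7, 40.0, 35.0], threshold_ms=33.3))  # ~8.4 ms
```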

 

Call of Duty: Black Ops 3

 


With Black Ops 3, our frame time plots continue to show smooth performance from the GTX 960 4GB and the R9 380X. The plots from the GTX 960 2GB and the R9 380 get a little spikier than in Fallout 4, but both cards still manage to stay below the 50-ms mark.

The AMD R9 380X comfortably slots in between the GTX 960 4GB and the GeForce GTX 970 in average FPS, and it delivers lower 99th-percentile frame times than the GTX 960 cards and the R9 380 while doing it. Not bad.

Here’s a weird plot. That stairstep pattern you see in these curves isn’t a product of incorrect data. We think it’s actually an artifact of some kind of internal quantization going on in Black Ops 3‘s engine. Hm. We may have to investigate that phenomenon further. For now, though, you can see that the 380X’s frame time curve sits comfortably lower than the GTX 960s and the 380 once more, and even its worst frames aren’t too far above the crucial 33.3-ms threshold.


In our measures of “badness” for Black Ops 3, none of the cards spend any time past the troublesome 50-ms mark. The R9 380X and 4GB GTX 960 barely spend any time above our critical 33.3-ms mark, too. The Radeon R9 380 spends somewhat more time in this range, while the 2GB GTX 960 really struggles. Perhaps this is one game where the extra memory helps.

 

Battlefield 4


Battlefield 4 is the first game that gives the AMD R9 380X a bit of trouble. The 380X and the GTX 960s are right on top of one another in our average FPS measurements for BF4, but the Radeon is slightly worse off than the 960s in our 99th-percentile frame time metric.


The 380X only spends a tiny bit of time past the 50-ms mark, but it struggles for nearly a second with frames that take longer than 33.3 ms to produce. The Radeon R9 380 has a harder time still. The GTX 960 cards have much less trouble here. The GTX 960 4GB is a little worse off than its 2GB counterpart, but the difference is minuscule.

 

The Witcher 3


Here’s another game that trips up the 380 and 380X a bit. You can see a few spikes toward the 50-ms range from both cards. The GTX 960s don’t appear to have as much trouble with whatever the Radeons were chewing on.

As expected from those spikes, the R9 380X leads its class in the potential measure of average FPS, but it falls behind the GeForce cards in our 99th-percentile frame time metric.


In our badness measures, the lower-end Radeons spend a bit of time past the 50-ms mark, resulting in noticeable slowdowns we could feel while play-testing. They also deliver more than a few frames beyond the 33.3-ms threshold. That all adds up to a less smooth experience than the GeForce cards can deliver in this title.

 

Ashes of the Singularity

Although Ashes of the Singularity‘s built-in benchmark allows us to collect both DirectX 11 and DirectX 12 data, we’ve chosen to collect and crunch numbers in DirectX 12 mode only for the graphs below.


Here’s another test where the GTX 960 4GB and the R9 380X are neck-and-neck in both performance potential and delivered results. If you were hoping for a test that demonstrates the advantages of one card over another, keep reading. This ain’t it.


As the frame time plot above hinted at, Ashes of the Singularity may be the hardest game in our test suite for these cards to run smoothly. Save for the heavyweights, all of the cards tested spend a lot of time past the 50-ms mark, and even the 33.3-ms threshold is a major challenge with our test settings.

 

Fable: Legends
We ran the Fable: Legends DirectX 12 benchmark at its 1080p preset. For more information about this benchmark, take a look at our full Fable: Legends rundown.

 


The R9 380X has a better time of it in the Fable: Legends benchmark. Our frame time plot doesn’t reveal any major spikes in the 380X’s graph, and the card takes the top of its class in both potential and delivered performance.


Though the R9 380X delivers generally smooth performance in Fable: Legends, this benchmark appears to give the GeForce GTX 960 2GB a pretty hard time. That card spends quite a bit more time than we prefer to see past the 50-ms and 33.3-ms thresholds. The GTX 960 4GB is much closer to the rest of the pack.

 

Power consumption

Please note that our “under load” tests aren’t conducted in an absolute peak scenario. Instead, we have the cards running a real game, Crysis 3, in order to show us power draw with a more typical workload.

No great surprises here. The Radeon R9 380X needs about 50 watts more juice under load to do its thing than the Nvidia cards or the Radeon R9 380. Cards based on Nvidia’s Maxwell silicon continue to be paragons of power efficiency. Let’s see how the extra power used by the Radeons translates into noise and heat.

Noise levels

At idle, these cards are all about as quiet as can be. Any variations in noise levels at idle are likely attributable to changes in the noise floor in our testing environment, not the cards themselves. We’d expect any of them to be inaudible inside a PC case at idle.

Under load, Asus’ Strix R9 380X “beats” the Sapphire card by one decibel. Both cards hang right with the custom-cooled GTX 960s from MSI and Gigabyte, too, despite their bigger appetites for electricity and the corresponding increase in heat production that brings. At least the Asus and Sapphire cards won’t transmit that fact to your ears when they’re running all-out.

Load temperatures attest to the effectiveness of the aftermarket coolers on the Asus and Sapphire cards, too. All of the R9 380X and GTX 960 cards we tested are within a degree or two of one another under load, even with their factory-boosted clocks. Nothing to complain about here.

 

Final Conclusions and Closing Thoughts

As usual, we’ll wrap up our tests with a couple of value scatter plots. To make our graphs work, we’ve converted our 99th-percentile numbers into FPS. The best values tend toward the upper left of the plot, where performance is highest and prices are lowest. We use a geometric mean to limit the impact of any outliers on the final score.
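Here’s a small Python sketch of that conversion and the geometric mean, using placeholder per-game numbers rather than our measured results.

```python
# Convert each game's 99th-percentile frame time to FPS, then take the
# geometric mean so no single game dominates the overall score.
from math import prod

def overall_fps_score(percentile_frame_times_ms):
    fps_values = [1000.0 / t for t in percentile_frame_times_ms]
    return prod(fps_values) ** (1.0 / len(fps_values))

print(overall_fps_score([33.3, 25.0, 40.0]))  # placeholder per-game results
```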


And there you have it. Whether you measure the R9 380X by our preferred 99th-percentile method or in traditional average FPS, this card performs better than the GeForce GTX 960 4GB. It also costs slightly more. All told, that means these cards offer similar value propositions. As AMD hoped, the 380X delivers solid performance in the games we tested, even at 2560×1440 with considerable amounts of eye candy turned up.

It’s true that the Maxwell-based GeForce GTX 960 is more power-efficient than the R9 380X, but the aftermarket coolers on the Sapphire and Asus Radeons we tested are more than up to the task of keeping the underlying silicon cool without making any more noise than the competing GTX 960 cards. Builders might notice a bit more heat from their PCs with an R9 380X pushing pixels, but that’s mostly hair-splitting. The R9 380X is a very solid option for its price.

Cost of the AMD Radeon R9 380X

 

The biggest problem for the R9 380X might be those two dots in the upper-right corner of our scatter plot. Some variants of the GeForce GTX 970 can be had for under $300 right now, and the R9 390 is right there with it. If you’ve been paying attention to our test results, you know that the extra cash buys you a lot more graphics card. Unless the expected $240 street price of an R9 380X is the absolute top of your budget, we think it’s worth saving up the cost of one or two PC games and getting into an R9 390 or a GTX 970.

The post Full Review of AMD Radeon R9 380X Graphics Card appeared first on The Tech Report.

]]>
Corsair HS70 wireless gaming headset reviewed https://techreport.com/review/corsairs-hs70-wireless-gaming-headset-reviewed/ Mon, 30 May 2022 15:00:00 +0000 http://localhost/wordpress/corsairs-hs70-wireless-gaming-headset-reviewed corsair hs70 wireless

Introduction This article is a TechReport review of the Corsair HS70 wireless gaming headset. It includes photos, information, audio samples, costs...

The post Corsair HS70 wireless gaming headset reviewed appeared first on The Tech Report.

]]>
corsair hs70 wireless

Introduction

This article is a TechReport review of the Corsair HS70 wireless gaming headset. It includes photos, information, audio samples, costs, and more.

This post is sponsored by DVwarehouse where you can buy refurbished/used computer products for fantastic prices. See the great prices for refurbished apple desktop computers on their website. Buying DVWarehouse computer products helps support us at Techreport!

The Review of the Corsair HS70 Wireless

When we looked at the Corsair HS50 gaming headset last fall, we were initially concerned with the build quality concessions made to keep its price down, but we ultimately came away impressed with a headset that got the job done without breaking the bank. Now, Corsair is back with the Corsair HS70 wireless headset. Like a superhero after a continuity reset, Corsair’s budget headset has returned with a very slightly updated look and some noteworthy new powers. The cord is gone, replaced with a wireless USB dongle, and there’s now 7.1 surround sound virtualization on tap.


Once again, Corsair is going for a premium-feeling headset with a relatively budget-minded sticker price of $90. Since this headset is so similar to the HS50, we’ll be digging for differences between the two to see whether the wireless connectivity and surround sound are worth the extra $40 Corsair is asking for the HS70.

Build quality

Unveiling an upgraded piece of hardware means keeping the features that work while trying to bring something fresh to justify a higher price tag. The build quality on the HS70 is all about not fixing what isn’t broken. In fact, the headset is all but identical to its predecessor. The only visible differences come from what isn’t there with this headset. If you cut off the cord, you’d have a hard time telling the HS70 apart from the HS50 in a lineup.

Both headsets feature the same strong metal band and earcup harnesses, as well as the same plastic casing. The headband uses well-cushioned leatherette with a quilted-stitch pattern, and the ear cups seem to use the same material without the quilting. The actual differences between the HS70 and the HS50 are a Micro-USB charging port on the left earcup, where the cord would otherwise go, and a power button on the right one. The mute button and microphone are in the same spots on the left ear, too.

That all means that while wearing them, the HS70 feels identical to the HS50. The headset offers enough clamping force to stay on without budging during normal use. The earcups swivel just enough to account for different head shapes, but not enough to sit flat on your collar when you take the headset off to get some air. The mic bends easily in and out of the way with a steel gooseneck-style boom arm and holds its position firmly.

Like the HS50, the HS70 looks good, too. However, while the mic is removable like with the HS50, the fact that the HS70 requires a USB connection means you won’t usually be able to wear it outside the house. You could, however, plug it into your work computer and wear it for calls and music alike without getting a second look. Corsair went with a simpler color scheme this time, too: you can get the HS70 with white or black plastic. Since the PlayStation 4 is the only console the HS70 supports alongside PCs, Corsair has forgone the blue- and green-accented options the HS50 offered to coordinate with the latest consoles from Sony and Microsoft.

 

Cutting the Cord with the Corsair HS70 Wireless Headset

The biggest difference in actually using the HS70s comes from the added wireless functionality. Corsair promises up to 40′ of range over the 2.4 GHz wireless band and up to 16 hours of battery life.

It’s a bit difficult to test the upper limits of that 40-foot promise, but I gave it a shot. I live in an apartment with a lot of walls, and I can get about 25 feet away from the USB dongle before the signal fails. The only way to get a steady signal at that distance is to stand as still as Drax eating Zargnuts in Avengers: Infinity War. That figure might not sound impressive, but keep in mind there are a few walls, some kitchen appliances, and interfering wireless signals between me and the dongle by that point. That range also closely matches that of other wireless headsets I’ve used. More importantly, I never experienced any drops or unsteady signal when using the HS70 within reach of my PC or PlayStation 4.

The battery life promise holds up pretty well, too. While I wasn’t able to time the drain down to the minute, I made it through three Monster Hunter World sessions on the PlayStation 4 and two Sea of Thieves sessions on my PC, each roughly two to four hours in length, before the headset really started crowing at me. That works out to a solid 14-plus hours on a single charge.

The identical build quality of the HS70s compared to the HS50s is mostly a blessing. This is a comfortable, attractive, and relatively light headset—but this would’ve been a good place for Corsair to surprise us a little. The HS70 headset costs 80% again as much as its predecessor, and a little extra effort in the materials or design would go a long way to justifying the price increase. For example, steel-clad or fully swiveling earcups would be welcome improvements. Even so, what’s here is good. It feels good to wear.

There’s a palpable downside to only having a wireless connection available, though. Since there’s no option to use a regular 3.5-mm analog jack to connect to a source, the HS70 can’t connect to the Xbox One, phones, tablets, or other devices with only analog outputs. You can use it with PCs and the PlayStation 4, and that’s it.

Audio and Mic Quality

If you dig into the spec sheet, you’ll see that aside from the wireless functionality, the hardware inside the HS70 appears to be identical to that of the HS50, as implied by matching impedance and frequency response of the 50-mm drivers and the same mic sensitivity rating. Therefore, the minute-to-minute experience with the HS70 is going to be quite similar to using the HS50s. Check out our original review for the details.

Just like with the HS50, music feels a little muted, with unimpressive-but-serviceable highs and an overall quiet presentation. The HS70s still won’t compare to music-focused headphones when used for that purpose, but they’ll get the job done if you’re the kind of person who uses headphones to listen to music instead of music to listen to headphones.

The most notable parts of the audio experience with the HS70s come from the compromises introduced by its wireless connectivity. When the headset is powered on but not pumping out audio, there’s a noticeable hiss. It’s quickly drowned out by most games, but if you’re going to listen to quiet jazz, as is part of my headphone testing regimen, you’ll likely notice it. Other wireless headsets I’ve tried have a hiss of some kind, but the one on the HS70 is much more audible than most.

Speaking of electrical noise, another issue I experienced with the HS70s surfaced when I tried to wear them while they were charging. On top of the regular hiss, the headset made a weird high-frequency electrical noise that ceased when the headphones were unplugged. To try and figure out the source of this problem, we tested with a couple of different cords and plugged the headset into the front and rear USB ports of a desktop PC, a 2015 MacBook Air, and a dedicated USB charging hub. The amount of noise varied, but it was always present.

We brought up this issue with Corsair, and the company advised us that this unwanted noise wasn’t expected behavior. The company sent us a second sample to rule out the chance that our unit had a fluke, but the hiss and interference while charging on our second sample were still present—just less prominent. To be fair, most users will probably not be jamming out and juicing up the HS70 at the same time, but freedom from interference seems like a basic box to tick with audio gear. It’s not that Corsair doesn’t know how to build a headset free of these issues, either: TR Editor-in-Chief Jeff Kampman didn’t report hiss or whine from his Void Wireless RGB cans when he tried them out in a similar manner.


Corsair’s latest CUE interface for headsets

Plugging the HS70s into a PlayStation 4 console will give you the same stereo audio as a regular wired headset. Plugging them into a PC and installing Corsair’s CUE software opens up two possibilities not available on the HS50: custom equalization (EQ) and virtual surround sound. Setting up EQ curves is pretty straightforward in CUE, and what you set there will largely be down to whatever you prefer. I give credit where credit’s due, though: setting up EQ profiles in Corsair’s software is a few steps easier than with Logitech’s software right now. That makes using the HS70 that much more appealing than the competition.


A sample of the HS70’s mic

The virtual surround sound functionality is something I still waver back and forth on; as a general concept, I’m rarely very impressed by it. The HS70 gets the job done, though, and in a long session with Doom the positioning felt accurate. Virtual surround still doesn’t feel very convincing to me, but in that respect the HS70 is no different from any other virtual-surround headphones I’ve put on.

 

Conclusions of Corsair HS70 Wireless Headset

The main appeals of Corsair’s HS70 over its HS50 predecessor come down to two elements: wireless functionality and surround-sound virtualization. Personally, I find that I rarely want to be more than an arm’s reach away from my console controller, and I don’t want to wear my headset more than a few feet from my computer. That might be because I own a pile of wired headphones big enough to worry my friends and loved ones, so I’m used to being tethered to my listening device of choice. All told, the wireless connection might not mean as much to me as it might for others. I can take it or leave it.

What I can’t ignore is that the Corsair HS70 Wireless is more expensive than the HS50, yet it loses utility by going USB-only. The HS70 is compatible only with PCs and the PlayStation 4 as audio sources, whereas the HS50 can plug into just about anything with a 3.5-mm audio jack, including Xbox One controllers, phones, and tablets. Given that the headset doesn’t actually sound any different than the more affordable HS50 and has to be fed from a charging source every now and again, the cord-cutting isn’t painless.

Costs

In exchange for the missing simplicity and flexibility, though, you’ll get a wireless headset with a fairly good range. This is along with solid battery life and the additional benefit of 7.1 virtual surround sound. The $90 asking price for these cans is less than what lots of other headsets out there go for. Wired headsets like Logitech’s G Pro can go for about the same price, for just one example (although it’s worth noting that headset makers can plow resources not spent on radios into other measures of quality). On the flip side, other wireless headsets like HyperX’s Cloud Flight demand a whopping $60 more than what the HS70s go for.

Keep in mind, though, that even Corsair’s own Void Pro RGB wireless headset goes for $100 as I write this. I haven’t had experience with it myself, but TR Editor-in-Chief Jeff Kampman has a pair. He notes that they fold flat on the neck when they’re at rest. And his pair doesn’t have the same issues with hiss or noise that the HS70s do (even across multiple samples). I prefer the look of the HS70, though, and I like the steel harnesses for the ear cups.

With all that in mind, whether the HS70s are right for you will ultimately come down to personal preference. The HS50s give you the same look and feel for almost half the price, while Corsair’s Void RGB headset seems to avoid the HS70’s pitfalls for just a few dollars more than this set. If you really like the look of the HS50s but just hate wires, the HS70s aren’t a bad option. But looks alone don’t make them a standout.


The post Corsair HS70 wireless gaming headset reviewed appeared first on The Tech Report.

]]>
Full Review of AMD Radeon HD 7790 Graphics Card https://techreport.com/review/amds-radeon-hd-7790-graphics-card-reviewed/ Sun, 29 May 2022 17:07:00 +0000 http://localhost/wordpress/amds-radeon-hd-7790-graphics-card-reviewed amd radeon hd 7790 asus version

This article is a review of the AMD Radeon HD 7790 graphics card. It includes graphs, images, information, data, costs, and game testing. This post is sponsored by DVwarehouse where...

The post Full Review of AMD Radeon HD 7790 Graphics Card appeared first on The Tech Report.

]]>
amd radeon hd 7790 asus version

This article is a review of the AMD Radeon HD 7790 graphics card. It includes graphs, images, information, data, costs, and game testing.

This post is sponsored by DVwarehouse where you can buy refurbished/used computer products for fantastic prices. See the great prices for refurbished apple desktop computers on their website. Buying DVWarehouse computer products helps support us at Techreport!

Introduction

Some of us were expecting AMD to unleash next-generation Radeons this spring, but it was not to be. We learned last month that the company intends to keep its Radeon HD 7000 series around through much of 2013. A completely new product series is in the works, but it’s not due out until very late this year—likely just before Christmas.

As we hung our heads listening to the news, we learned that AMD’s plans didn’t preclude new releases long before the holidays. In fact, we were told that the Radeon HD 7000 series would soon be expanded with fresh cards featuring new silicon. We’d soon have some previously unseen hardware to sink our teeth into.

A New GPU

True to its word, AMD has now introduced the Radeon HD 7790, a $149 graphics card powered by a new GPU called Bonaire. This addition offers an interesting middle ground between the Radeon HD 7770 and the Radeon HD 7850, not to mention a potentially compelling alternative to Nvidia’s GeForce GTX 650 Ti.

The new Bonaire GPU

Buckle up, folks, because AMD’s code names get a little bumpy here. Bonaire is officially part of the Sea Islands product family. Sea Islands no longer implies a next-gen graphics architecture as it once did, however; in AMD’s words, the name now encompasses “all products we’re producing in 2013.” Bonaire, despite being a completely new ASIC, is actually based on the exact same Graphics Core Next graphics architecture as the Radeon HD 7000 series (which was itself code-named Southern Islands).

Bonaire also happens to be the name of an island in the planet’s northern hemisphere, and the Northern Islands code name refers to the Radeon HD 6000 series. But I digress.

All this code-name mumbo jumbo aside, Bonaire is an exciting addition to AMD’s GPU lineup. While it features the same 128-bit memory interface and ROP arrangement as Cape Verde, the chip that powers the Radeon HD 7770, it has four more compute units and one additional geometry engine. That means the ALU count has gone up from 640 to 896, the number of textures filtered per clock has increased from 40 to 56, and the number of triangles rasterized per clock cycle has risen from one to two.

            ROP      Texels filtered/  Shader  Rasterized       Memory interface  Estimated transistor  Die size  Fabrication
            pixels/  clock (int/fp16)  ALUs    triangles/clock  width (bits)      count (millions)      (mm²)     process node
Cape Verde  16       40/20             640     1                128               1500                  123       28 nm
Bonaire     16       56/28             896     2                128               2080                  160       28 nm
Pitcairn    32       80/40             1280    2                256               2800                  212       28 nm
GF114       32       64/64             384     2                256               1950                  360       40 nm
GK104       32       128/128           1536    4                256               3500                  294       28 nm
GK106       24       80/80             960     3                192               2540                  214       28 nm

Translation: Bonaire is rigged to offer higher floating-point math performance, more texturing capability, and better tessellation performance than Cape Verde. Also, as you’ll see on the next page, AMD equips Bonaire with substantially faster GDDR5 RAM, which gives it a bandwidth advantage despite its identical memory controller setup.

In addition to the different unit mix, Bonaire has learned a trick from Trinity and Richland, AMD’s mainstream APUs. That trick takes the form of a new Dynamic Power Management (DPM) microcontroller, which enables Bonaire to switch between voltage levels much quicker than Cape Verde or other members of the Southern Islands family. Behold:


The diagram above shows the different DPM states available to Bonaire. For comparison, a second diagram in AMD’s materials shows the states available to a “Boost”-equipped version of Tahiti, as found in the Radeon HD 7970 GHz Edition.

In Tahiti, there are four discrete DPM states, each with its own voltage and clock speed. The GPU can switch between clock speeds very rapidly—in as little as 5-10 ms—but voltage changes require “several hundred milliseconds.” In order to stay within its power and thermal limits at the High and Boost states, the chip attempts to reduce its clock speed without lowering the voltage level. AMD calls these reductions “inferred” states. They enable the GPU to respond quickly to load increases in order to prevent power consumption from going over the limit. If lowering the clock speed isn’t enough, then the chip falls back to a lower discrete state, which involves a voltage cut—and therefore takes longer than a simple clock-speed adjustment.

That’s not a bad approach. However, it means the GPU may often find itself with more voltage than it needs to operate at a given clock speed. As a result, power consumption may be higher than it should be, while the core clock speed (and thus performance) might be lower than it needs to be.

How Does Bonaire Improve the Balance?

How does Bonaire improve on this formula? Well, it has a total of eight discrete DPM states, each with a different clock speed and voltage. Bonaire can switch between those states as quickly as every 10 milliseconds, which removes the need for the “inferred” states seen in Tahiti—that is, clock speed reductions without corresponding voltage cuts. This means the GPU can very quickly select the optimal clock speed and voltage combination to offer the best performance at the predefined power envelope.

Although it lacks support for the Boost power state, the Cape Verde chip in the Radeon HD 7770 otherwise behaves much like Tahiti, whose DNA it shares. Thus, the additional power states in Bonaire give the Radeon HD 7790 an advantage in power efficiency over the 7770.
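To make the distinction concrete, here’s a rough Python sketch of the selection logic described above. This is our illustration, not AMD’s implementation; the state table and the power model are invented placeholders.

```python
# Simplified sketch: with eight discrete states, the controller can simply
# pick the fastest clock/voltage pair whose estimated power fits the budget,
# instead of holding voltage high and trimming only the clock the way
# Tahiti's "inferred" states do. All numbers below are placeholders.
BONAIRE_DPM_STATES = [  # (clock MHz, voltage V), slowest to fastest
    (300, 0.85), (450, 0.90), (600, 0.95), (700, 1.00),
    (800, 1.05), (875, 1.10), (950, 1.15), (1000, 1.20),
]

def pick_state(power_budget_w, states, base_power_w=40.0):
    best = states[0]
    for clock, voltage in states:
        # Dynamic power scales roughly with frequency times voltage squared.
        est_power = base_power_w * (clock / 1000) * (voltage / 1.20) ** 2
        if est_power <= power_budget_w:
            best = (clock, voltage)
    return best

print(pick_state(30.0, BONAIRE_DPM_STATES))  # -> (875, 1.1) under this model
```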

Here are the cards:

The Radeon HD 7790 is slated to be available for purchase on April 2 at a suggested e-tail price of $149. At “participating retailers,” the card will be sold with a free copy of BioShock Infinite. Since the Radeon HD 7750 and 7770 were left out of the Never Settle Reloaded bundle, that’s good to know.

AMD sent us a list of some of the Radeon HD 7790 variants its partners have in store. Here it is:

Officially, the AMD Radeon HD 7790 is meant to run at an even 1GHz with 6Gbps memory. As you can tell from the list above, however, retail cards with above-reference clock speeds will be commonplace—possibly more so than standard designs. When briefing us about the 7790, AMD suggested that it gave partners an exceptional amount of leeway in designing their cards. The chipmaker also said we’ll see versions of the 7790 with 2GB of GDDR5 memory, up from the default 1GB. None of those 2GB models are in the list above, though, and we weren’t given a timetable for their arrival.

AMD eschewed sending us a reference board for our review. Instead, the company sent us Asus’ Radeon HD 7790 DirectCU II, which is one of the nicer offerings coming in April. Asus expects to sell it for around $155, but exact pricing hasn’t yet been set.

amd radeon hd 7790 asus

That big, heatpipe-laden dual-slot cooler almost dwarfs the stubby circuit board, which measures only 6.8″ in length. Cooler included, the card is about 8.5″ long. The DisplayPort, HDMI, and dual DVI outputs can drive up to six displays (provided you use a DisplayPort hub), and the card takes power from a single six-pin PCI Express connector. (Note that the PCIe connector is rotated, so that the clip faces the back of the circuit board. If it were in the usual position, the heatsink fins would be in the way.)

Unfortunately, we didn’t get the DirectCU II until Tuesday, which gave us too little time to benchmark it. By then, we’d already started testing Sapphire’s Radeon HD 7790, which FedEx delivered the day before.

The Sapphire card has the same 1075MHz core speed and 6.4Gbps memory speed as the Asus. It features a similar dual-fan cooler, albeit without conspicuous heat pipes, and it has the same 8.5″ overall length and display output arrangement. The circuit board spans the whole length of the cooler, however, and the PCI Express power connector sits at the top of the card with the clip facing the front—a pretty common arrangement.

A farewell to the Radeon HD 7850 1GB

AMD’s Radeon HD 7850 1GB came out in October. Since its launch, the card has wooed value-conscious gamers by delivering much of the performance of its 2GB namesake at a lower price—often as little as $160. We’ve recommended it in several of our system guides.

Now, sadly, the 7850 1GB is about to disappear from retail listings forever. The Radeon HD 7790 will be its de facto successor.

According to AMD, the 7850 1GB is going away because memory makers have stopped producing the 128MB GDDR5 chips it requires. The card has four 64-bit dual-channel memory controllers that must each be fed by two memory chips; it therefore needs eight 128MB chips to achieve a 1GB capacity. The 7790 doesn’t have that problem. With only two 64-bit memory controllers, it can deliver the same 1GB capacity using larger, 256MB GDDR5 chips, which are still being made.
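The chip-count arithmetic is simple enough to verify in a couple of lines:

```python
# Capacity = controllers x chips per controller x chip size, per the text above.
hd7850_capacity_mb = 4 * 2 * 128   # eight 128MB chips on four 64-bit controllers
hd7790_capacity_mb = 2 * 2 * 256   # four 256MB chips on two 64-bit controllers
print(hd7850_capacity_mb, hd7790_capacity_mb)  # 1024 1024 -- both reach 1GB
```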

This disappearing act gives the 7790 some pretty big shoes to fill. GPUs with 128-bit memory interfaces don’t often match the performance of their 256-bit siblings, especially when they’re based on the same architecture. If the 7790 fails to deliver, folks could be forced to splurge for a Radeon HD 7850 2GB, which would set them back at least $180.

Comparing the AMD Radeon HD 7790 

The Radeon HD 7790 won’t just be trying to live up to the 7850 1GB’s legacy. It will also face competition from higher-clocked versions of Nvidia’s GeForce GTX 650 Ti, which are available in the same price range, as well as some of the old GeForce GTX 560 cards that remain on the market. We’ll look at real-world benchmarks very soon, but before we do, let’s take a quick look at theoretical numbers. The table below includes peak rates for both reference cards and the souped-up variants we’ve got in our labs.

                                   Base clock  Boost clock  Peak ROP rate  Texture filtering   Polygon throughput  Peak shader  Memory transfer  Memory bandwidth
                                   (MHz)       (MHz)        (Gpix/s)       int8/fp16 (Gtex/s)  (Mtris/s)           tflops       rate (GT/s)      (GB/s)
Radeon HD 7770                     1000        N/A          16             40/20               1000                1.3          4.5              72
Radeon HD 7790                     1000        N/A          16             56/28               2000                1.8          6.0              96
Sapphire Radeon HD 7790            1075        N/A          17             60/30               2150                1.9          6.4              102
Radeon HD 7850 1GB                 860         N/A          28             55/28               1720                1.8          4.8              154
GeForce GTX 650 Ti                 928         N/A          15             59/59               1856                1.4          5.4              86
Zotac GeForce GTX 650 Ti 2GB AMP!  1033        N/A          17             66/66               2066                1.6          6.2              99
GeForce GTX 560                    810         N/A          26             45/45               1620                1.1          4.0              128
MSI GeForce GTX 560 Twin Frozr II  870         N/A          28             49/49               1760                1.2          4.2              134

Compared to the Radeon HD 7850 1GB, the 7790 in theory has similar texture filtering and shader performance, and it should offer even higher tessellation throughput. However, the 7790 has only three fifths the ROP rate, which means less resolve power for multisampled anti-aliasing, and two thirds the memory bandwidth. Those limitations may or may not affect real-world gaming performance, depending on the nature of the graphics workload.

The 7790 is more comparable to the GTX 650 Ti. On paper, these two cards have roughly equivalent ROP rates, polygon throughput, and memory bandwidth. The 7790 enjoys an advantage in shader throughput, while the 650 Ti promises better texture filtering performance, especially for fp16 texture formats. This contest is probably too close to call at this stage.

As for the old GTX 560, that card has the same advantages as the Radeon HD 7850 1GB—higher memory bandwidth and ROP rates—but it trails the 7790 in key rates like texture filtering, shader arithmetic, and polygon rasterization. The 7790 may come out ahead more often than not in newer games, especially those that use shader-based antialiasing techniques instead of MSAA.
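If you want to sanity-check the peak numbers in the table above, they follow directly from the unit counts and clock speeds. Here’s the math for the reference Radeon HD 7790 in Python:

```python
# Peak rates for the reference Radeon HD 7790, from its unit counts and clocks.
clock_ghz = 1.0
rop_rate_gpix = 16 * clock_ghz              # 16 Gpix/s
tex_rate_gtex = 56 * clock_ghz              # 56 Gtex/s (int8)
shader_tflops = 896 * 2 * clock_ghz / 1000  # 1.8 TFLOPS (one FMA = 2 flops)
bandwidth_gbs = 6.0 * 128 / 8               # 96 GB/s (6 GT/s over a 128-bit bus)
print(rop_rate_gpix, tex_rate_gtex, round(shader_tflops, 1), bandwidth_gbs)
```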

A quick word about our guinea pigs

We had an unusually short time window to review the Radeon HD 7790, and AMD didn’t reveal the 7790’s pricing until Tuesday evening. We did our best to estimate the card’s positioning and obtain a comparable GeForce GTX 650 Ti from Nvidia’s graphics card partners, but we were unable to get one in time.

The card you’ll see tested alongside the 7790 over the next few pages is a Zotac AMP! Edition offering, which has 2GB of onboard memory and somewhat higher clock speeds than most other GTX 650 Ti variants. It currently retails for $181 at Newegg, or about $20 more than what Sapphire expects to charge for its Radeon HD 7790 at launch.

Now, there’s nothing particularly wrong with comparing these two cards. They’re both genuine retail offerings, and the performance comparison should be enlightening. That said, we’d ask that you please keep the price difference in mind as you peruse our benchmarks. GTX 650 Ti variants priced around the $160 mark are likely to be a little slower than our sample. Also, please stay tuned. Very soon, we’ll have another article with more benchmarks that include another version of the GTX 650 Ti.

We were, however, able to get a new model of the Radeon HD 7770 GHz Edition in time for the review: Diamond’s version of the card, which is a good representative of vanilla offerings available out there. It runs at the reference 1000MHz core and 4500MT/s memory speeds, and it has a stubby dual-slot cooler with a large, quiet fan. This seems to be a stubbier version of the model selling at Newegg for $135.99 (before a $20 mail-in rebate) right now.

Game Testing the AMD Radeon HD 7790

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and we reported the median results. Our test systems were configured like so:

Processor Intel Core i7-3770K
Motherboard Gigabyte Z77X-UD3H
North bridge Intel Z77 Express
South bridge
Memory size 4GB (2 DIMMs)
Memory type AMD Memory
DDR3 SDRAM at 1600MHz
Memory timings 9-9-9-28
Chipset drivers INF update 9.3.0.1021
Rapid Storage Technology 11.6
Audio Integrated Via audio
with 6.0.01.10800 drivers
Hard drive Crucial m4 256GB
Power supply Corsair HX750W 750W
OS Windows 8 Professional x64 Edition
                                     Driver revision              GPU base clock  Memory clock  Memory
                                                                  (MHz)           (MHz)         size
Diamond Radeon HD 7770               Catalyst 12.101.2.1000 beta  1000            4500          1GB
Sapphire Radeon HD 7790              Catalyst 12.101.2.1000 beta  1075            6000          1GB
XFX Radeon HD 7850 1GB Core Edition  Catalyst 12.101.2.1000 beta  860             1200          1GB
MSI GeForce GTX 560 Twin Frozr II    GeForce 314.21 beta          880             1050          1GB
Zotac GeForce GTX 650 Ti AMP!        GeForce 314.21 beta          1033            1550          2GB

Thanks to AMD, Corsair, and Crucial for helping to outfit our test rig. Asus, Diamond, MSI, Sapphire, XFX, and Zotac have our gratitude, as well, for supplying the various graphics cards we tested.

Image quality settings for the graphics cards were left at the control panel defaults, except on the Radeon cards, where surface format optimizations were disabled and the tessellation mode was set to “use application settings.” Vertical refresh sync (vsync) was disabled for all tests.

We used the following test applications:

Some further notes on our methods:

  • We used the Fraps utility to record frame rates while playing a 90-second sequence from the game. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We tested each Fraps sequence five times per video card in order to counteract any variability. We’ve included frame-by-frame results from Fraps for each game, and in those plots, you’re seeing the results from a single, representative pass through the test sequence.
  • We measured total system power consumption at the wall socket using a P3 Kill A Watt digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench. The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Skyrim at its High quality preset.
  • We measured noise levels on our test system, sitting on an open test bench, using a TES-52 digital sound level meter. The meter was held approximately 8″ from the test system at a height even with the top of the video card.You can think of these noise level measurements much like our system power consumption tests, because the entire systems’ noise levels were measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.
  • We used GPU-Z to log GPU temperatures during our load testing.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Synthetic testing

Yes, yes, I know you’re dying to get into game benchmarks. However, synthetic benchmarks set the stage for everything else, helping to demonstrate how well the theoretical peak throughput numbers we’ve discussed translate into delivered performance.

Texture filtering

Card | Peak bilinear filtering (Gtexels/s) | Peak bilinear FP16 filtering (Gtexels/s) | Memory bandwidth (GB/s)
Radeon HD 7770 | 40 | 20 | 72
Sapphire Radeon HD 7790 | 56 | 28 | 102
Radeon HD 7850 1GB | 55 | 28 | 154
MSI GeForce GTX 560 Twin Frozr II | 49 | 49 | 134
Zotac GeForce GTX 650 Ti 2GB AMP! | 66 | 66 | 99

In the real world, memory bandwidth plays a part in texturing performance. That probably explains why the Radeon HD 7790 falls behind the 7850 1GB here.
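For reference, the peak bandwidth figures in these tables fall out of a simple calculation. Here's a minimal sketch in Python, using the Radeon HD 7770's published 128-bit bus and 4.5 GT/s effective memory rate as the example:

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Bytes per transfer (bus width / 8) times effective transfers per second."""
    return bus_width_bits / 8 * data_rate_gtps

# Radeon HD 7770: 128-bit bus at an effective 4.5 GT/s -> 72 GB/s,
# matching the table above.
print(peak_bandwidth_gbs(128, 4.5))  # 72.0
```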

Tessellation

Card | Peak rasterization rate (Mtris/s) | Memory bandwidth (GB/s)
Radeon HD 7770 | 1000 | 72
Sapphire Radeon HD 7790 | 2150 | 102
Radeon HD 7850 1GB | 1720 | 154
MSI GeForce GTX 560 Twin Frozr II | 1760 | 134
Zotac GeForce GTX 650 Ti 2GB AMP! | 2066 | 99

Wow. The 7790’s two geometry engines do wonders for tessellation performance, especially when paired with a 1075MHz core clock speed, as on the Sapphire card.

Shader performance

Card | Peak shader arithmetic (TFLOPS) | Memory bandwidth (GB/s)
Radeon HD 7770 | 1.3 | 72
Sapphire Radeon HD 7790 | 1.9 | 102
Radeon HD 7850 1GB | 1.8 | 154
MSI GeForce GTX 560 Twin Frozr II | 1.2 | 134
Zotac GeForce GTX 650 Ti 2GB AMP! | 1.6 | 99

The 7790 comes out on top here.
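The peak arithmetic and rasterization numbers above follow a similar pattern. Here's a rough sketch of the math, assuming Bonaire's published unit counts (896 ALUs, two geometry engines); treat those counts as assumptions if you adapt this to another chip:

```python
def peak_tflops(alus: int, clock_ghz: float) -> float:
    """ALUs x 2 FLOPS per clock (multiply-add) x clock in GHz, in TFLOPS."""
    return alus * 2 * clock_ghz / 1000.0

def peak_mtris(geometry_engines: int, clock_mhz: float) -> float:
    """One triangle per geometry engine per clock, in Mtris/s."""
    return geometry_engines * clock_mhz

# Radeon HD 7790 (Bonaire) at the Sapphire card's 1075MHz core clock.
print(round(peak_tflops(896, 1.075), 1))  # 1.9 TFLOPS, as in the table
print(peak_mtris(2, 1075))                # 2150 Mtris/s, as in the table
```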

We’ll have to stop our theoretical explorations here, unfortunately. I normally include LuxMark, an OpenCL-accelerated ray-tracing benchmark, in this set, but it refused to run using the drivers AMD provided for the 7790. Oh well; moving on…

Tomb Raider

Developed by Crystal Dynamics, this reboot of the famous franchise features a more believable Lara Croft who, as the game progresses, sheds her fear and vulnerability to become a formidable killing machine. I tested Tomb Raider by running around a small mountain area, which is roughly 10% of the way into the single-player campaign.

This is a rather impressive-looking game that’s clearly designed to take full advantage of high-end gaming PCs. The Ultra and Ultimate detail presets were too hard on these cards, so I had to settle for the High preset and leave the game’s TressFX hair physics disabled. Testing was done at 1080p.

Frame time (ms) | FPS rate
8.3 | 120
16.7 | 60
20 | 50
25 | 40
33.3 | 30
50 | 20

Let’s preface the results below with a little primer on our testing methodology. Along with measuring average frames per second, we delve inside the second to look at frame rendering times. Studying the time taken to render each frame gives us a better sense of playability, because it highlights issues like stuttering that can occur—and be felt by the player—within the span of one second. Charting frame times shows these issues clear as day, while charting average frames per second obscures them.

To get a sense of how frame times correspond to FPS rates, check the table above.
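If you'd like to reproduce this kind of summary from your own Fraps-style frame-time logs, a minimal sketch might look like the following (the frame-time list is made up for illustration):

```python
import statistics

def fps_from_frame_time(ms: float) -> float:
    """Convert a single frame time in milliseconds to an FPS rate."""
    return 1000.0 / ms

def summarize(frame_times_ms):
    """Average FPS plus the 99th-percentile frame time for one run."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return {
        "avg_fps": len(frame_times_ms) / total_seconds,
        "p99_ms": statistics.quantiles(frame_times_ms, n=100)[98],
    }

# A steady 16.7 ms cadence with a single 60 ms hitch: the FPS average
# barely moves, but the 99th-percentile frame time flags the stutter.
run = [16.7] * 99 + [60.0]
print(fps_from_frame_time(16.7))  # 60 FPS
print(summarize(run))             # avg_fps ~58, p99_ms ~60
```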

We’re going to start by charting frame times over the totality of a representative run for each system. (That run is usually the middle one out of the five we ran for each card.) These plots should give us an at-a-glance impression of overall playability, warts and all. You can click the buttons below the graph to compare our protagonist to its different competitors.


Right away, it’s clear that the Radeon HD 7790 is much closer to the 7850 1GB than to the 7770, whose plot shows frequent spikes above 30 ms. However, the 7790’s plot is still a little higher than that of the 7850 and the more expensive GTX 650 Ti 2GB AMP! Edition, which suggests that it’s not quite as fast.

We can slice and dice our raw frame-time data in several ways to show different facets of the performance picture. Let's start with something we're all familiar with: average frames per second. Average FPS is widely used, but it has some serious limitations. Another way to summarize performance is to consider the threshold below which 99% of frames are rendered, which offers a sense of overall frame latency, excluding fringe cases. (The lower the threshold, the more fluid the game.)

The average FPS and 99th-percentile results confirm our appraisal of the frame time plots. However, the performance difference between the 7790 and its faster rivals isn’t that big, especially in the 99th-percentile metric, which gives us a better indication of seat-of-the-pants smoothness and playability than average FPS.

Now, the 99th percentile result only captures a single point along the latency curve, but we can show you that whole curve, as well. With single-GPU configs like these, the right-hand side of the graph—and especially the last 5% or so—is where you'll want to look. That section tends to be where the best and worst solutions diverge.

Finally, we can rank the cards based on how long they spent working on frames that took longer than a certain number of milliseconds to render. Simply put, this metric is a measure of “badness.” It tells us about the scope of delays in frame delivery during the test scenario. Here, you can click the buttons below the graph to switch between different millisecond thresholds.
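One common way to formulate that "badness" measure is to accumulate, for every slow frame, only the portion of its render time past the cutoff. A small sketch of that idea, again with a made-up frame-time list:

```python
def time_beyond(frame_times_ms, threshold_ms):
    """Sum the portion of each slow frame past the threshold, in ms."""
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

run = [16.7] * 99 + [60.0]  # same invented run as above
for cutoff in (50.0, 33.3, 16.7):
    print(cutoff, round(time_beyond(run, cutoff), 1))
```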


None of the cards spend much time beyond our most important threshold of “badness” at 50 milliseconds—that means none of them dip below the relatively slow frame production rate of 20 FPS for long. In fact, except for the Radeon HD 7770, none of our cards spend a significant amount of time working on frames that take longer than 33.3 ms to render. That should mean pretty fluid gameplay from each of them.

Crysis 3

Yep. This is the new Crysis game. There’s not much else to say, except that this title has truly spectacular graphics. To test it, I ran from weapon cache to weapon cache at the beginning of the Welcome to the Jungle level for 60 seconds per run.

I tested at 1080p using the medium detail preset with high textures and medium SMAA antialiasing.


We’re seeing a bit more variability in frame times here than we did in Tomb Raider. Variability by itself isn’t necessarily bad; it’s frame time spikes that truly impair gameplay. Except for the 7770, which struggles at these settings, the cards we tested have surprisingly similar plots.

The 7790 may not have the highest average frame rate, but its 99th-percentile frame times are lower than those of all the other cards. Given the choice, we’d pick the 7790 over the cards with higher FPS averages.

Frame times for the GeForces and the 7850 1GB start spiking around the 95th percentile, but the 7790 holds largely steady up until the 97th or 98th percentile. In other words, it stays smoother throughout a larger chunk of the run. (The 7770 shows a similar progression to the 7790, but its frame times are far higher on average. In practice, it feels very sluggish and choppy in this game.)


If the percentile line graph above didn’t make it clear, this will. The Radeon HD 7790 spends a negligible amount of time working on frames that take longer than 50 ms to render, and it also spends less time beyond 33.3 ms than the other cards. Fewer spikes, smoother gameplay.

The 16.7-ms graph doesn’t show the 7790 in as positive a light, but none of these cards are quick enough for that metric to matter very much. Even the 7850 1GB spends almost nine full seconds, or about one seventh of the run, above that threshold. (For reference, a 16.7 ms frame time works out to a 60 FPS frame rate.)

Borderlands 2

For this test, I shamelessly stole Scott’s Borderlands 2 character and aped the gameplay session he used to benchmark the Radeon HD 7950 and GeForce GTX 660 Ti. The session takes place at the start of the “Opportunity” level. As Scott noted, this section isn’t precisely repeatable, because enemies don’t always spawn in the same spots or attack in the same way. We tested five times per GPU and tried to keep to the same path through the level, however, which should help compensate for variability.

I tested at 1920×1080. All other graphics settings were maxed out except for hardware-accelerated PhysX, which isn’t supported on the Radeons.


The Radeon HD 7790 does a much better job of keeping frame times steady than the other Radeons in this game. In fact, the other Radeons don't look like they're benefiting from the Borderlands 2 latency optimizations AMD first rolled out in the Catalyst 13.2 beta. Perhaps the beta driver AMD sent us with the 7790 doesn't include those optimizations for the other cards, or perhaps the optimizations simply don't apply to the 7850 and 7770. We've asked AMD to clarify and are awaiting a response.


In any event, the 7790 looks to be about neck-and-neck with the pricier GeForce GTX 650 Ti AMP! Edition here. Not a bad showing at all.

Sleeping Dogs

I haven’t had a chance to get very far into Sleeping Dogs myself, but TR’s Geoff Gasior did, and he got hooked. From the small glimpse I’ve received of the game’s open-world environment and martial-arts-style combat, I think I can see why.

The game’s version of Hong Kong seems to be its most demanding area from a performance standpoint, so that’s what I benchmarked. I took Wei Shen on a motorcyle joyride through the city, trying my best to remember I was supposed to ride on the left side of the street.

I benchmarked Sleeping Dogs at 1920×1080 using a tweaked version of the “High” quality preset, with vsync disabled and SSAO bumped down to “Normal.” The high-resolution texture pack was installed, too.


Again, we have a nice, smooth plot for the Radeon HD 7790, and spiky plots for the other Radeons. Hmm. Whatever AMD’s doing, the 7790 performs very well, displaying even fewer spikes than the GTX 650 Ti 2GB AMP! and the GTX 560.


Yep. The 7790 hits a home run here.

The Elder Scrolls V: Skyrim

Here, too, I borrowed Scott’s test run, which involves a walk through the moor not far from the town of Whiterun—and perilously close to a camp of Giants.

The game was run at 1920×1080 using the “Ultra” detail preset. The high-resolution texture pack was installed, as well.


The 7770 and 7850 1GB fare poorly here, too, even though AMD addressed frame latency spikes in Skyrim in recent Catalyst beta drivers. By contrast, the 7790 appears to perform better; its plot has fewer, smaller frame time spikes than its fellow Radeons’ plots. Odd.


Although it has a higher FPS average, the 7790 generally trails the GTX 650 Ti AMP! in Skyrim. It fares worse in the 99th percentile frame time, and it spends more time beyond our 50- and 33-ms thresholds.

Battlefield 3

I tested Battlefield 3 by playing through the start of the Kaffarov mission, right after the player lands. Our 90-second runs involved walking through the woods and getting into a firefight with a group of hostiles, who fired and lobbed grenades at us.

I kept things simple, using the game’s “High” detail preset at 1080p.


The 7790 shadows both the 7850 1GB and GTX 650 Ti 2GB AMP! Edition in these frame-by-frame plots.

Our average FPS and percentile results confirm our initial observation. The 7790, 7850 1GB, and GTX 650 Ti 2GB AMP! are all neck-and-neck.


The 7790 pulls ahead ever so slightly in the “time spent beyond 33.3 ms” graph, but not by much. These three top contenders are about equally playable in Battlefield 3.

Power consumption

The Radeon HD 7790 actually draws a touch more power than the 7850 1GB at idle. However, it’s substantially more power-efficient under load, where it doesn’t consume much more than our reference-clocked Radeon HD 7770.

Noise levels and GPU temperatures

Sapphire’s dual-fan cooler keeps the 7790 both quiet and very cool.

A note about our noise levels: I live on the eighth floor of a tall building, and it was unusually windy both times I tried to take noise readings for this review. I attempted to alleviate the problem by taking the lowest reading from a one-minute recording for each card at each setting, so occasional wind gusts shouldn't have impacted the numbers substantially. These cards really are all very quiet—except for the 7850, whose cooler whines a little more than the others under load. (I also heard a faint mechanical chirping from the 7850 and the GTX 650 Ti that wasn't present on the 7790.)

Final Conclusions and Thoughts on AMD Radeon HD 7790

Let’s wrap things up with a couple of our trademark value scatter plots. In both plots, the performance numbers are geometric means of data points from all games tested. (They exclude the synthetic tests at the beginning of the article.) The first plot shows 99th-percentile frame times converted into FPS for easier reading; the second plot shows simple FPS averages.

Prices for the GTX 650 Ti, 7850 1GB, and 7770 were taken from the Newegg listings for the cards we tested. The GTX 560’s price was taken from a Newegg listing for a comparable offering that’s still available, while the 7790’s price was taken from Sapphire.

The best deals should reside near the top left of each plot, where performance is high and pricing is low. Conversely, the least desirable offerings should be near the bottom right.
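For the curious, here's roughly how those summary numbers are produced, sketched in Python. The per-game values below are invented for illustration; only the method (convert 99th-percentile frame times to FPS, then take a geometric mean) reflects what the plots do:

```python
from math import prod

def geometric_mean(values):
    """The nth root of the product of n values."""
    return prod(values) ** (1.0 / len(values))

# Hypothetical per-game 99th-percentile frame times (ms) for one card,
# converted to FPS before averaging, as in the first scatter plot.
p99_ms = {"Tomb Raider": 25.1, "Crysis 3": 38.0, "Skyrim": 21.4}
p99_fps = [1000.0 / ms for ms in p99_ms.values()]
print(round(geometric_mean(p99_fps), 1))
```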


Well, well. Despite being thrown into the ring with a more expensive GeForce GTX 650 Ti card with twice as much memory, the Radeon HD 7790 more than holds its own overall. In fact, it’s quicker on average according to our 99th-percentile plot, which we think offers the best summation of real-world performance. The 7790 is negligibly slower in the average FPS plot—but it’s still a better deal considering the lower price.

Based on these numbers, I’d expect the 7790 to perform even better compared to a lower-clocked, like-priced version of the GeForce GTX 650 Ti. We’ll have to run the numbers to be sure, but this is hardly an outlandish extrapolation to make.

The 7790 also manages to outdo the 7850 1GB overall, proving its worth as a successor to that product. Sure, as the tests on the previous pages show, the 7790 doesn’t always outmatch its predecessor. Nevertheless, the fact that it does so overall should certainly be of some comfort to those saddened by the 7850 1GB’s departure.

Add to that the Radeon HD 7790’s power efficiency, its low noise levels, and the free copy of BioShock Infinite in the box, and it looks like we have a winning recipe from AMD.

Of course, Nvidia isn’t sitting still, and the firm may not be willing to take this new onslaught lying down. We’ve been hearing rumors that Nvidia will soon unleash a new card that could land in this exact same price range. Things may be about to get even more interesting.

Related Post: AMD’s Radeon HD 6990 Graphics Card Reviewed

The post Full Review of AMD Radeon HD 7790 Graphics Card appeared first on The Tech Report.

]]>
Espresso Machines that You Can Set Up in the Comfort of Home https://techreport.com/gadget-digest/espresso-machines-that-you-can-set-up-in-the-comfort-of-home/ Wed, 25 May 2022 19:55:25 +0000 https://techreport.com/?p=3477574 Espresso Machines that You Can Set Up in the Comfort of Home

With the craze of the pandemic, many things became vastly different for people worldwide. No attending sporting events or concerts. No going into the office or school. And unfortunately, no...

The post Espresso Machines that You Can Set Up in the Comfort of Home appeared first on The Tech Report.

]]>
Espresso Machines that You Can Set Up in the Comfort of Home

With the craze of the pandemic, many things became vastly different for people worldwide. No attending sporting events or concerts. No going into the office or school. And unfortunately, no more trips to your local coffee shop. However, while in-person commerce came to a halt, innovation did not. We as a society adapted and learned how to conduct business and find pleasure in new ways. And thanks to at-home espresso machines, we found a new way to enjoy our coffee without leaving the comfort of home.

At-home espresso machines are not an entirely new innovation, but they became much more essential with COVID-19. As customers found new ways to get their caffeine due to shops everywhere being closed, espresso machines became much more popular. In this article, we’ll take a look at some espresso machines changing our home lives for the better.

Breville Bambino Plus

First on our list of at-home espresso machines is the Bambino Plus, from Breville. Made of stainless steel with a 64oz capacity, the Bambino Plus is truly top of the line. Customization options allow customers total control over their espresso. Users adjust the temperature, texture, and length of the espresso shot to optimize their experience. The Bambino Plus also includes an automatic milk frother to ensure the perfect amount of foam. The espresso machine also features an especially impressive three-second heating time to ensure you can get your morning caffeine in no time at all. With a price of $499.95, the Breville Bambino Plus is one of the most consistent options on the market.

Rancilio Silvia

Next is the Rancilio Silvia espresso machine. Made of stainless steel, it features commercial-quality parts. With a 67oz water tank and premium components, customers can get an authentic espresso experience at home. The Rancilio Silvia also comes with a top-end steam wand to help further customize your experience. The machine doesn't have some of the automatic features of other espresso machines, so it is truly meant for those who want an authentic coffee-making experience. The machine also takes around three minutes to heat, so it doesn't offer quite the same convenience as other models. However, if you are interested in an authentic espresso experience and top-of-the-line quality, look no further.

Jura A1 Piano

Coming in at a price of $799, the Jura A1 Piano is a high-end espresso machine. Made of plastic with a 37oz container, it might be hard to see why the machine is so expensive at first. However, the device has a number of features to maximize your convenience, including a grinder and removable water tank. While it doesn’t have a steam wand or automatic frother, the Jura A1 allows users maximum control over their coffee distribution. Furthermore, the Jura A1 Piano can complete its cycle in around 30 seconds. With a sleek design, convenient features, and outstanding customization options, the Jura A1 Piano is a great at-home espresso option.

Nespresso CitiZ & Milk

With a price of $319, the Nespresso CitiZ & Milk is the first affordable option on our list. It is also the first on our list to use capsules. Made of steel and with a 33oz tank, it is one of the smaller options on the list as well. The Nespresso CitiZ is extremely user-friendly and requires very little prior knowledge on the consumer’s end. The machine comes with a milk frother on the side, and a few settings to customize your brew. The largest issue with this machine is that capsules have to be purchased repeatedly and aren’t exactly cheap. But, if you are a coffee fanatic looking for an easy-to-use machine, then the Nespresso CitiZ & Milk may be perfect for you.

DeLonghi Stilosa Espresso Machine EC260

Our final option for at-home espresso machines is also the cheapest. The DeLonghi Stilosa Espresso Machine EC260 comes with a price of just $99.95. Featuring a design with part plastic and part stainless steel, this machine has a sleek look. It comes with a 33.8 oz tank and a built-in milk frother. Users can adjust the size of their pour to best suit their needs for each day. The only real downside to this product is that it is entirely manual, from the grinding to the measuring. But, if you are someone looking for great coffee at an affordable price, then this may be the perfect machine for you. It lacks some of the advanced features of other machines but is both dependable and durable. Plus, the removable tank makes it much easier to clean.

Tech-Filled Golf Clubs: Take Your Game to the Next Level
With summer almost here, it's time to get back on the course and tee off. Take a look at some tech-fitted golf clubs to improve your game.

With the pandemic, people spent a lot of valuable time and money on their hobbies. While much of the outside world was at a standstill, the tech world was still innovating. New gadgets and technologies are now a part of our everyday lives. They’re even a part of our golf outings. Whether you are a professional or a casual player, tech-filled golf clubs can take your game to the next level.

Thankfully, with the pandemic drawing to a close, people are getting back out into the world. Likewise, children are back in school, employees are back in the office, and golfers are back on the course. Therefore, here are some of our top picks for tech-filled golf clubs to elevate your golfing.

TaylorMade P790 Irons

The TaylorMade P790 series, from one of the top names in golf, is a great way to upgrade your iron game. Among the most technologically packed clubs on the market, the P790 is loaded with innovations to boost your play.

The irons are fitted with intelligent sweet-spot technology and tungsten in the clubhead. Countless hours of testing and research went into creating a club that has a larger sweet spot with more forgiveness. As a result, this makes the P790 irons perfect for improving golfers.

Along with the upgrades to the clubface technology, the TaylorMade irons offer a number of other impressive features. The clubs implement a new carbon steel material that allows for a thinner headwall. Coupling this new thin wall tech with TaylorMade’s SpeedFoam Air technology allows the irons to be ridiculously lightweight. Consequently, with the massive reductions in weight, the clubs can reach higher ball speeds and better launch angles with ease.

One of the top tech-filled golf clubs on the market, the TaylorMade P790 irons are perfect for the golfer looking to up their game.

Callaway Rogue ST Pro Irons

Next on our list, the Callaway Rogue ST Pro irons are ideal for low-mid handicap golfers. Similar to the TaylorMade P790 series, the Callaway irons now feature a weighted tungsten club head. The tungsten helps to provide better launch conditions and faster speeds for your swing.

Similarly, the new hollow-body design and urethane microspheres mean your swing can be quicker and purer than ever before. The new design even helps to give you a better sound as the ball comes off the sweet spot.

Callaway is even taking advantage of AI software to help deliver the best performance possible. Callaway offers AI face optimization for their Rogue ST Pro Irons so that customers can get more consistent ball spin off the clubface. This AI face optimization, along with Callaway’s 450 steel, delivers higher ball speeds at a more consistent rate.

The only potential area of concern with the Callaway Rogue ST Pro Irons is in the “pureness” of the club. They don’t offer as much forgiveness as some other irons, which may cause issues for golfers with higher handicaps.

Cobra LTDX Driver

Shifting over to the big sticks, our first tech-filled driver is the Cobra LTDX Driver. Cobra is one of the top names in golf, and their newest driver makes it easy to see why.

Using their trademark PWR-COR technology, the Cobra LTDX Driver optimally positions weight to reduce spin and increase ball speed. And similar to the irons on this list, the LTDX incorporates tungsten material to make their club much more forgiving.

Along with the new materials, the LTDX also offers users a new face design that enables a larger sweet spot. As a result, with top-of-the-line materials and research-driven designs, the LTDX delivers high ball speeds and a forgiving clubface, making it the perfect tech-filled golf club to upgrade your game.

TaylorMade Stealth Driver

The final tech-filled golf club to make our list comes once again from the folks at TaylorMade. The TaylorMade Stealth Driver is easily one of the top drivers available on the market.

As with all TaylorMade drivers, the Stealth Driver is extremely sleek and comes with a number of customization options. Consequently, users can fit themselves and experiment with a variety of options to optimize the club to their game.

However, where the TaylorMade Stealth Driver makes its name is in its new Carbonwood technology. The driver face has 60 layers of precisely placed carbon to maximize the speed of the ball off the club. Then a nanotexture cover is put in place to help optimize ball spin and launch angles.

Because the new technology significantly reduces weight, the driver is also much more forgiving than the old titanium models. Furthermore, the new carbon technology provides an absolutely beautiful sound as the ball comes off the clubface. With a sleek, lightweight design, and significant improvements to technology, the Stealth Driver is one of the best options available.

Full Review of Nvidia GeForce GTX 590

Introduction

This article is a review of the Nvidia GeForce GTX 590. It includes photos, graphs, information, comparisons, game tests, and recommendations.

This post is sponsored by DVwarehouse, where you can buy refurbished and used computer products at fantastic prices. See the great prices for refurbished Apple desktop computers on their website. Buying DVwarehouse computer products helps support us at The Tech Report!

Nvidia GeForce GTX 590 Graphics Card Review

March has certainly been a month of extremes around here. We kicked it off with a look at the Core i7-990X, a world-beating six-core CPU, and then moved on to the absolutely epic Radeon HD 6990. After that, we investigated a pair of breathtakingly fast SSDs. Now, we’re back on the graphics beat with a premium offering from Nvidia, the GeForce GTX 590. Like the Radeon HD 6990, the GeForce GTX 590 is a dual-GPU video card planted firmly at the top of the lineup.

This is Nvidia’s first attempt at a dually product in quite some time, at least in the frenetically paced graphics market. The last one, the GeForce GTX 295, debuted over two years ago. As we noted in our 6990 review, cramming two high-end GPUs onto a dual-slot expansion card isn’t easy; power and thermal limitations often define these products, more so than most. That’s probably one reason we didn’t see a dual-GPU entrant in the GeForce GTX 400 series. The first-gen chips based on the Fermi GPU architecture were famously late and thermally constrained, making them iffy candidates for the SLI-on-a-stick treatment.

The GF110 GPU in today’s high-end GeForce cards is still a rather enormous chip, but it’s a little easier to tame—and is a formidable rival to the Cayman GPU in the Radeon HD 6900 series. Naturally, then, Nvidia has cooked up an answer to the Radeon HD 6990, one that reveals a decidedly different approach to the extreme dually graphics card.


Sizing up Gemini

 

Code-named “Gemini” during its development, the GTX 590 has a pair of GF110 chips onboard, and those GPUs haven’t had any of their onboard hardware disabled. Unit counts therefore mirror those for a pair of GeForce GTX 580 cards in SLI. Yet in order to keep the GTX 590 within a manageable power limit, Nvidia has dialed back the clock speeds to levels well below the GeForce GTX 570’s. The GTX 590’s core clock is just 607MHz, and the GDDR5 memory ticks along at 854MHz—or about 3.4 GT/s. So, although these are fully-enabled GF110 GPUs, the GTX 590’s projected rates for key graphics capabilities look very much like a pair of GeForce GTX 570s, not two full-on GTX 580s.

The Numbers

Here’s a quick look at the numbers.

Card | Peak pixel fill rate (Gpixels/s) | Peak bilinear integer texel filtering rate (Gtexels/s) | Peak bilinear FP16 texel filtering rate (Gtexels/s) | Peak shader arithmetic (GFLOPS) | Peak rasterization rate (Mtris/s) | Peak memory bandwidth (GB/s)
GeForce GTX 560 Ti | 26.3 | 52.6 | 52.6 | 1263 | 1644 | 128
GeForce GTX 570 | 29.3 | 43.9 | 43.9 | 1405 | 2928 | 152
GeForce GTX 580 | 37.1 | 49.4 | 49.4 | 1581 | 3088 | 192
GeForce GTX 590 | 58.3 | 77.7 | 77.7 | 2488 | 4856 | 328
Radeon HD 6850 | 24.8 | 37.2 | 18.6 | 1488 | 775 | 128
Radeon HD 6870 | 28.8 | 50.4 | 25.2 | 2016 | 900 | 134
Radeon HD 6950 | 25.6 | 70.4 | 35.2 | 2253 | 1600 | 160
Radeon HD 6970 | 28.2 | 84.5 | 42.2 | 2703 | 1760 | 176
Radeon HD 5970 | 46.4 | 116.0 | 58.0 | 4640 | 1450 | 256
Radeon HD 6990 | 53.1 | 159.4 | 79.7 | 5100 | 3320 | 320
Radeon HD 6990 AUSUM | 56.3 | 169.0 | 84.5 | 5407 | 3520 | 320

We’re assuming perfect scaling from one GPU to two in the figures above, which isn’t always how things work out in practice. However, these are simply theoretical peaks, and even the most efficient GPUs don’t always maintain these rates in real applications.

On paper, at least, the GTX 590 just beats out the Radeon HD 6990 in ROP throughput and memory bandwidth, two keys to fast operation at high resolutions with edge antialiasing, but it’s slightly slower in other areas. We wouldn’t sound any alarms about the GTX 590’s vastly slower theoretical shader arithmetic rates. Nvidia’s shader architecture tends to be more efficient, delivering performance comparable to AMD’s in many cases, if not superior. Meanwhile, the GTX 590 absolutely crushes the Radeon HD 6990 in peak triangle rasterization rate, which is but one indication of the GF110’s quite real end-to-end superiority in geometry processing and DirectX 11 tessellation throughput. The question there is whether or not Nvidia’s geometry processing advantage will matter in real games, and it’s a vexing one.

All in all, the GTX 590 looks to be endowed with outrageously high specifications. Yet those specs look very much like those of the primary competition, the Radeon HD 6990. This is gonna be a close one, folks.

The Card

 


Dude. Glow.

Like its competition, the GTX 590 presents dual 8-pin aux power inputs to the user, threatening to require a PSU upgrade. The card’s max power rating, or TDP, is 365W, just 10W below the peak power deliverable through the combination of a motherboard’s PCIe x16 slot and a couple of those 8-pin auxiliary inputs. Not coincidentally, that’s also 10W below the Radeon HD 6990’s TDP.

The GTX 590’s expansion slot covers are pierced by three dual-link DVI ports, a mini-DisplayPort connector, and as much thermal venting as the card’s designers could muster. That mini-DP output supports DisplayPort 1.1a, so it’s less capable in several ways than the DisplayPort 1.2 outputs on newer Radeons. Then again, those Radeons can drive only one dual-link DVI display natively; connecting more will require expensive adapters.

When it comes to truly extreme display configurations, Nvidia and AMD have taken different paths. The GTX 590’s dual-link outputs will allow it to power a trio of four-megapixel monitors at once—or three smaller (~2 MP) monitors at 120Hz for wrap-around stereoscopic gaming via Nvidia’s 3D Vision scheme. That DisplayPort output enables the 590 to drive four displays simultaneously, but only for productivity; multi-monitor Surround Gaming is limited to a maximum of three displays. Meanwhile, AMD isn’t nearly as far down the path of cultivating support for stereoscopic 3D, but its Eyefinity multi-monitor gaming scheme will happily support six displays at once. The 6990 can do it, too, thanks to five onboard outputs and the possibility of connecting more monitors via a DisplayPort 1.2 hub. True to this mission, the 6990 also comes with more video memory than the GTX 590—2GB per GPU and 4GB total, versus 1.5GB per and 3GB total on the 590. It’s up to you to choose why you get a headache: from wearing flickery glasses, or from trying to track objects across display bezel boundaries.


GeForce GTX 580 (top) versus GTX 590 (middle) and Radeon HD 6990 (bottom)

If you’re looking for an indication of the differences in philosophy between Nvidia and AMD for cards of this ilk, look no further than the picture above. The GTX 590 is shown sandwiched between Nvidia’s best single-GPU card, the GeForce GTX 580, and the massive Radeon HD 6990. The GTX 580 is a very healthy 10.5″, the 590 is a considerable 11″, and the 6990 is just a smidgen shy of a full 12″. Although the GTX 590’s space requirements are definitely above the average, the 6990 will be problematic in all but the deepest PC enclosures. AMD has aimed for peak extremeness. Nvidia has tailored its solution to be a bit more of a good citizen in this way, among others.


Source: Nvidia.

Source: Nvidia.

Another way the GTX 590 aspires to be easier to get along with? Acoustics. Superficially, this card doesn’t look too terribly different from its rival, with a centrally located fan flanked by dual heatsinks whose copper bases house vapor chambers. However, Nvidia says the GTX 590 isn’t much louder than the GeForce GTX 580—and is thus substantially quieter than the howls-like-a-banshee 6990. We’ll put that claim to the test, of course.

Otherwise, the 590 has all of the sophisticated bits you might expect from a dual-GPU solution of this sort, including a 10-phase power supply with digital VRMs and Nvidia’s familiar NF200 PCI Express switch chip, which routes 16 PCIe 2.0 lanes to each GPU and another 16 lanes to the PCIe x16 slot.

And, yes, there is an SLI connector onboard, raising the prospect of quad SLI configurations based on dual GTX 590s. The card will do it, but Nvidia wants users to be careful about the selection of components to wrap around such a config. It recommends a motherboard with an additional expansion slot space between the PCIe x16 slots, so there’s adequate room between the cards for the interior one’s fan to take in air. The firm is certifying motherboards that meet its qualifications for quad SLI, along with cases and PSUs. Right now, cases are the biggest bugaboo. Only three are certified—Thermaltake’s Element V, SilverStone’s Raven RV02, and CoolerMaster’s HAF X—although more are purportedly coming soon. You could probably build a very nice quad SLI setup with some other popular full-tower cases and the right sort of cooling. Our sense is that Nvidia is emphasizing certification simply because it wants to ensure a good user experience and adequate cooling.

As you might expect, the GTX 590 will be priced at $699.99, exactly opposite the 6990. Cards should be available at online retailers starting today. Interestingly, you’ll only find GTX 590 cards from Asus and EVGA available for sale in North America. In other parts of the world, the 590 will be exclusive to other Nvidia partners. My understanding is the cards have been divvied up in this manner because they’re relatively low-volume products. It may have been deemed impractical to have six or more brands offering them simultaneously in one market. How low volume? When we asked, the firm told us it would be shipping “thousands of cards” worldwide. That’s noteworthy because it’s not tens of thousands—just thousands. That said, Nvidia expects a “steady supply available in the market.” Perhaps the $700 price tag will ensure demand doesn’t exceed supply over time.

One more thing
As you may know, the Radeon HD 6990 comes with an alternative firmware, accessible via a small DIP switch, that enables a configuration dubbed “uber mode” by AMD. The switch that turns on “uber mode” is the “Antilles Unlocking Switch for Uber Mode,” or AUSUM, for short. Because this config exceeds the PCIe power spec and isn’t guaranteed to work properly in all systems, it’s essentially overclocking, though it’s tacitly approved by the GPU maker.

We tested the 6990 with the AUSUM switch enabled, and that raised an issue of fairness. Nvidia hasn’t given the GTX 590 any comparable mechanism, but the card can be overclocked in software. We figured, by all rights, we should test an overclocked configuration for the GTX 590, as well. One has to be careful here, though, because the GF110 chips will definitely reach much higher clock speeds when given enough voltage—we reached 772MHz at 1025 mV, similar to the GTX 580—but you’ll also definitely bump up against the GTX 590’s power limiting circuitry if you push too hard. The result, as we learned, is that performance drops with the supposedly overclocked config.

We eventually decided on a more mildly overclocked config in which the GPU core was raised to 690MHz, the GPU core voltage was increased from 938 mV to 963 mV, and the memory clock was tweaked up to 900MHz (or 3.6 GT/s). This setup was easily achieved with MSI’s Afterburner software, proved quite stable, and, as you’ll see in the following pages, performed consistently better than stock. The only thing left to do then was give these settings a name, since they lacked one. Folks, say hello to Wasson’s Intrepid Clock Konfig, Extreme Dually—or WICKED. We’ve put WICKED and AUSUM head to head to see which is better.

Game Testing the Graphics Card

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and we’ve reported the median result.

Our test systems were configured like so:

Processor: Intel Core i7-980X
Motherboard: Gigabyte EX58-UD5
North bridge: X58 IOH
South bridge: ICH10R
Memory size: 12GB (6 DIMMs)
Memory type: Corsair Dominator CMD12GX3M6A1600C8 DDR3 SDRAM at 1600MHz
Memory timings: 8-8-8-24 2T
Chipset drivers: INF update 9.1.1.1025, Rapid Storage Technology 9.6.0.1014
Audio: Integrated ICH10R/ALC889A with Realtek R2.58 drivers
Graphics:
  • Dual Radeon HD 6870 1GB with Catalyst 11.4 preview drivers
  • Radeon HD 5970 2GB with Catalyst 11.4 preview drivers
  • Dual Radeon HD 6950 2GB with Catalyst 11.4 preview drivers
  • Radeon HD 6970 2GB with Catalyst 11.4 preview drivers
  • Dual Radeon HD 6970 2GB with Catalyst 11.4 preview drivers
  • Radeon HD 6990 4GB with Catalyst 11.4 preview drivers
  • MSI GeForce GTX 560 Ti Twin Frozr II 1GB + Asus GeForce GTX 560 Ti DirectCU II TOP 1GB with ForceWare 267.26 beta drivers
  • Zotac GeForce GTX 570 1280MB with ForceWare 267.24 beta drivers
  • Zotac GeForce GTX 570 1280MB + GeForce GTX 570 1280MB with ForceWare 267.24 beta drivers
  • Zotac GeForce GTX 580 1536MB with ForceWare 267.24 beta drivers
  • Zotac GeForce GTX 580 1536MB + Asus GeForce GTX 580 1536MB with ForceWare 267.24 beta drivers
  • GeForce GTX 590 3GB with ForceWare 267.71 beta drivers
Hard drive: WD RE3 WD1002FBYS 1TB SATA
Power supply: PC Power & Cooling Silencer 750W
OS: Windows 7 Ultimate x64 Edition, Service Pack 1

Thanks to Intel, Corsair, Western Digital, Gigabyte, and PC Power & Cooling for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following test applications:

Some further notes on our methods:

  • Many of our performance tests are scripted and repeatable, but for some of the games, including Battlefield: Bad Company 2 and Bulletstorm, we used the Fraps utility to record frame rates while playing a 60- or 90-second sequence from the game. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We raised our sample size, testing each Fraps sequence five times per video card, in order to counteract any variability. We’ve included second-by-second frame rate results from Fraps for those games, and in that case, you’re seeing the results from a single, representative pass through the test sequence.
  • We measured total system power consumption at the wall socket using a Yokogawa WT210 digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench. The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Battlefield: Bad Company 2 at a 2560×1600 resolution with 4X AA and 16X anisotropic filtering. We test power with BC2 because we think it's a solidly representative peak gaming workload.
  • We measured noise levels on our test system, sitting on an open test bench, using an Extech 407738 digital sound level meter. The meter was mounted on a tripod approximately 10″ from the test system at a height even with the top of the video card. You can think of these noise level measurements much like our system power consumption tests, because the entire system's noise levels were measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card's highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.
  • We used GPU-Z to log GPU temperatures during our load testing.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Bulletstorm Test

This game is stressful enough on a GPU to make it a decent candidate for testing cards of this type. We turned up all of the game’s image quality settings to their peaks and enabled 8X antialiasing, and then we tested in 90-second gameplay chunks.

The Radeons turn in a relatively strong showing here, with the 6990 essentially matching Nvidia’s fastest dual-GPU solution, a couple of GTX 580s in SLI. At its stock clocks, the GTX 590 performs almost exactly like our GTX 570 SLI setup.

F1 2010 Test

F1 2010 steps in and replaces CodeMasters' previous effort, DiRT 2, as our racing game of choice. F1 2010 uses DirectX 11 to enhance image quality in a few select ways. A higher-quality FP16 render target improves the game's high-dynamic-range lighting in DX11. A DX11 pixel shader is used to produce soft shadow edges, and a DX11 compute shader is used for higher-quality Gaussian blurs in HDR bloom, lens flares, and the like.

We used this game’s built-in benchmarking facility to script tests at multiple resolutions, always using the “Ultra” quality preset and 8X multisampled antialiasing.

Here’s another very strong showing for the red team. Even the Radeon HD 6950 CrossFireX setup outperforms two GTX 580s in SLI. Hang tight—this surely won’t last.

Civilization V Test

 

Civ V has a bunch of interesting built-in tests. Up first is a compute shader benchmark built into Civilization V. This test measures the GPU’s ability to decompress textures used for the graphically detailed leader characters depicted in the game. The decompression routine is based on a DirectX 11 compute shader. The benchmark reports individual results for a long list of leaders; we’ve averaged those scores to give you the results you see below.

Remember how I said we shouldn’t sound any alarms about the much higher theoretical shader throughput of the Radeons? Here’s an example of why that’s so. Even though the GTX 590 has relatively low clock speeds, and although the performance of multi-GPU setups doesn’t scale well in this test, the 590 comes out ahead of the fastest dual-Radeon config.

In addition to the compute shader test, Civ V has several other built-in benchmarks, including two we think are useful for testing video cards. One of them concentrates on the world leaders presented in the game, which is interesting because the game’s developers have spent quite a bit of effort on generating very high quality images in those scenes, complete with some rather convincing material shaders to accent the hair, clothes, and skin of the characters. This benchmark isn’t necessarily representative of Civ V‘s core gameplay, but it does measure performance in one of the most graphically striking parts of the game. As with the earlier compute shader test, we chose to average the results from the individual leaders.

You can worry a bit here if you’d like, though. This pixel-shader-intensive benchmark runs notably faster on the Radeons.

Another benchmark in Civ V focuses, rightly, on the most taxing part of the core gameplay, when you’re deep into a map and have hundreds of units and structures populating the space. This is when an underpowered GPU can slow down and cause the game to run poorly. This test outputs a generic score that can be a little hard to interpret, so we’ve converted the results into frames per second to make them more readable.

The GTX 590 proves to be faster than the 6990 in this test, although both cards offer more-than-adequate frame rates at these settings.

StarCraft II

Up next is a little game you may have heard of called StarCraft II. We tested SC2 by playing back 33 minutes of a recent two-player match using the game’s replay feature while capturing frame rates with Fraps. Thanks to the relatively long time window involved, we decided not to repeat this test multiple times. The frame rate averages in our bar graphs come from the entire span of time. In order to keep them readable, we’ve focused our frame-by-frame graphs on a shorter window, later in the game.

We tested at the settings shown above, with the notable exception that we also enabled 4X antialiasing via these cards’ respective driver control panels. SC2 doesn’t support AA natively, but we think this class of card can produce playable frame rates with AA enabled—and the game looks better that way.

The GeForces win the day in StarCraft II, and the GTX 590 performs particularly well, with our WICKED config nearly matching dual GTX 580s.

Battlefield: Bad Company 2

BC2 uses DirectX 11, but according to this interview, DX11 is mainly used to speed up soft shadow filtering. The DirectX 10 rendering path produces the same images.

We turned up nearly all of the image quality settings in the game. Our test sessions took place in the first 60 seconds of the “Heart of Darkness” level.

This one is more or less a dead heat, although WICKED outduels AUSUM in more pronounced fashion.

Metro 2033

We decided to test Metro 2033 at multiple image quality levels rather than multiple resolutions, because there’s quite a bit of opportunity to burden these graphics cards simply using this game’s more complex shader effects. We used three different quality presets built into the game’s benchmark utility, with the performance-destroying advanced depth-of-field shader disabled and tessellation enabled in each case.

The GeForce cards have an edge at the lower image quality presets, where frame rates are into the hundreds. Once we turn up all of the shader effects and detail settings, though, the standings even out somewhat. The result: the 6990 noses past the GTX 590, and AUSUM overcomes WICKED.

Aliens vs. Predator

AvP uses several DirectX 11 features to improve image quality and performance, including tessellation, advanced shadow sampling, and DX11-enhanced multisampled anti-aliasing. Naturally, we were pleased when the game’s developers put together an easily scriptable benchmark tool. This benchmark cycles through a range of scenes in the game, including one spot where a horde of tessellated aliens comes crawling down the floor, ceiling, and walls of a corridor.

For these tests, we turned up all of the image quality options to the max, along with 4X antialiasing and 16X anisotropic filtering.

The 6990 takes our final game test, while the GTX 590 falls in place right behind dual GTX 570s in SLI.

Power consumption

 

Although the GTX 590’s TDP rating is 10W lower than the 6990’s, the new GeForce draws substantially more power here than the Radeon. This isn’t an absolute max power situation—we’re running a game, not a synthetic stress test or the like—so the results aren’t necessarily surprising. The 6990 does look to be more power-efficient than the GTX 590, though, both at idle and when running Bad Company 2. In fact, the AUSUM 6990’s power use is comparable to the stock GTX 590’s. The WICKED config demonstrates why, perhaps, Nvidia hasn’t pushed any harder on GPU frequencies. While the clock headroom is there, the power headroom is not.

Noise levels and GPU temperatures

Here’s the eye-popping result of the day. Although the GTX 590 draws more power than the Radeon HD 6990 under load, it still registers as roughly 10 decibels quieter than the Radeon on our sound level meter. Subjectively, the difference is huge. The 6990 fills the room with a rough hissing sound, while the GTX 590 isn’t much louder than an average high-end video card. Even when it’s overclocked, the WICKED 590 is quieter than the stock 6990 by a fair amount.

One contributor to the difference is revealed in the GPU temperature results. The 6990’s fan control profile is relatively aggressive about keeping GPU temperatures low, perhaps out of necessity. The GTX 590 lands in the middle of this pack, at least until it goes WICKED.

Conclusions

Most folks seem to enjoy these value scatter plots, so let me drop one on you.

We’ve taken the results from the highest resolution or most intensive setting of each game tested, averaged them, and combined them with the lowest prevailing price at Newegg for each of these configurations. Doing so gives us a nice distribution of price-performance mixes, with the best tending toward the upper left and the worst toward the bottom right.

Keep in mind that we’ve only tested seven games, and that these standings could change quickly if we altered the mix. Still, based on our tests, the Radeon HD 6990 has an appreciable performance lead over the GTX 590 at the same price. Yes, that lead largely evaporates with our WICKED overclocked config, but going WICKED involves some pretty extreme power draw, even compared to AMD’s (admittedly more conservative) AUSUM option.

The GTX 590 is still breathtakingly fast—much quicker than a single GeForce GTX 580 and nearly as quick as a pair of GeForce GTX 570s or Radeon HD 6950s—but its true distinction, in our view, is its wondrously soft-spoken cooling solution. The GTX 590’s cooler is vastly quieter than the boisterous blower on the Radeon HD 6990. Combine that acoustic reality with the GTX 590’s shorter 11″ board length and understated appearance, and a sense of its personality begins to take shape. This card is more buttoned-down than the 6990. There’s no AUSUM switch, no bright red accents showing through the case window, and no obvious aural proclamation that Lots of Work is Being Done Here.

Frankly, I like that personality. If I were spending $700 on a dual-GPU graphics card for my ideal PC, I’d probably choose the GTX 590, even if it did mean sacrificing the absolute best performance.

But choosing the GTX 590 does mean making that sacrifice, and I’m not sure how that plays in the world of uber-extreme PC hardware components, where speed and specs have always been paramount. Prospective buyers of these rather exclusive video cards have an intriguing choice to make. In my view, the image quality and feature sets between the two GPU brands are roughly equal right now. The prices are the same, and the Radeon HD 6990 has more of nearly everything: frames per second, onboard memory, video outputs, as well as noise and board length. The 6990 has the most. Could it be that it’s still not the best?

20 kW solar system: Full Installation Walk-Through

Introduction

This article is about the installation of a 20 kW solar system. It explains everything you need to know, and more, about an installation one of our reviewers had done on his own home.

Why I Purchased a 20 kW Solar System

There are a lot of reasons I decided to have a big honkin' solar array installed on my roof. I'll get to the tangible ones shortly—backed by cold, hard data. The main reason that comes to mind, though, is less concrete. It's a feeling, or an instinct; a desire to realize the expectations of my 20th-century childhood as a 21st-century adult. This is something I've wanted ever since I plopped down my first solar power plant as an 11-year-old SimCity 2000 mayor, and my infatuation with solar arrays has only grown since.

Now, I understand that the emotional investment I have in this venture isn't something everyone is going to share. Still, it's an undeniable factor in my decision-making process. My parents were promised an inevitable fusion-powered, robot-filled smart home by science fiction when they were kids. Hanna-Barbera's vision of the future may not have come to pass, but I want my daughters to grow up in that futuristic fantasy, so it's up to me to deliver on that dream.

Possible Sound Investment

Beyond being simply a cool toy that I've wanted since I was a kid, or a sound investment I can make now that I'm all grown up, I consider this project to be a personal moral imperative. That may be an ironic phrase to invoke for what should be a by-the-numbers adventure, but I was happy to find that both my wallet and my conscience would be soothed by this endeavor. The math works out, but I'll add this much: I've already left my mark on the planet, and now I want to erase it. I think this is a good start.

And so, it begins

Starting With Research

As any adventure should, I began with research. Question number one was, “does solar power even make sense in Michigan?” Following closely behind, there was a second question of “exactly how large of an installation would I need?” There are tons of calculators online for finding those answers, and I used more than a few to get multiple opinions. Ultimately all the numbers really clicked for me once I saw the map below. It breaks down solar irradiance into kWh-per-kW-of-panels-installed, per year.


Image source: solarpowerrocks.com

That image is backed up by the feds and other resources online. You can see that Michigan's 1400 kWh/kW-yr ranking isn't amazing, but it isn't tragic either. Since my goal was to offset 100% of my electrical consumption with solar energy, the math here is pretty easy. All I had to do was add up my annual kWh usage based on my electrical bill, then solve for X. In 2018, our household included an entire extra family for eight months. That led to an over-the-top electrical consumption of 25,650 kWh for the year. In 2017, our usage was more normal, clocking in at 20,263 kWh. Overkill is underrated, so I used the total from 2018 for my calculations. Also, solar panels lose about 0.5% of their production each year—another reason to aim high.

Backed by Peak Potential Output

A solar panel system’s capacity is expressed as its peak potential output, or kWp. I divided my consumption of 25,650 kWh by the 1400 kWh/kW-yr from the map and came up with a requirement of an 18.32 kWp system. The calculations aren’t quite that easy, though. There are other factors to consider, like the laws of physics. Any system is going to have efficiency losses. Roughly speaking, you’re talking about needing an extra 20-25% more capacity to counter those losses. That brings my requirements up to 22.9 kWp for my 2018 numbers and 18.09 kWp if I look back to my 2017 usage. With that, I had my rough estimate.
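If you want to play along at home, here’s a minimal Python sketch of that back-of-the-envelope math. It assumes the 1,400 kWh/kW-yr yield from the map, a 25% loss pad, and the roughly 0.5%-per-year degradation mentioned earlier; the function names are mine, not anything from a real sizing tool.

```python
# Back-of-the-envelope solar sizing, using the article's numbers.
# The 25% loss pad and 0.5%/yr degradation are rough rules of thumb.

def required_kwp(annual_kwh, yield_kwh_per_kwp=1400, loss_factor=0.25):
    """Capacity needed to offset annual_kwh, padded for system losses."""
    return (annual_kwh / yield_kwh_per_kwp) * (1 + loss_factor)

def annual_output(kwp, years_old=0, degradation=0.005,
                  yield_kwh_per_kwp=1400):
    """Rough yearly production after panel degradation."""
    return kwp * yield_kwh_per_kwp * (1 - degradation) ** years_old

print(round(required_kwp(25_650), 1))   # 2018 usage -> ~22.9 kWp
print(round(required_kwp(20_263), 1))   # 2017 usage -> ~18.1 kWp
print(round(annual_output(20.46, 25)))  # year-25 output of the real array
```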

Of course, there’s still more to it than that. There’s potential shade to consider, as well as the directions the panels are facing, and the pitch of your roof. Those variables are a bit too specific to be within the scope of this piece, though. Just know that they will come into play when you start fooling with full-fledged solar calculators online. In my case, you’ll soon see that shade was not a problem for me. I do, however, have an east-west-facing roof instead of the ideal south-facing one. As it turns out, that’s not nearly as significant a factor as it used to be, given today’s modern panels and their prices.

Speaking of problems—Fish, you idiot. The. Sun. Goes. Down. That’s where “net metering” comes in, though. Net metering means you can bank credit with your power company when you produce more power than what you’re using. Think of it as using the electrical grid as a battery, at least financially. Alternatively, think of it as old-school rollover minutes on your cellphone. You’ll have to do your homework to see if it’s available where you live, but it made everything a lot easier (and cheaper) for me.

The Importance of Net Metering

Net metering is critical, because unless you plan on storing all the excess power yourself (more on that later), any solar panel array you invest in is only going to lower your bill while it’s actively generating power. You’ll still be drawing from the grid when it’s dark, cloudy, or when the panels are covered in snow. Without net metering, there would be no point to installing a system capable of producing more power than you use during the day. The bottom line is that with that ability, you can install whatever size system you want without having to worry about what to do with the power you’re not using in real time.


Number of minutes between sunrise and sunset for my latitude and longitude. Source: USNO

In my case, the type of net metering at my disposal means that any excess power I produce in a month carries over to the next month as a credit on my bill. I don’t get paid cash for producing more power than I need, but the credit stays on my account for 12 months. At that point, any surplus I may still have drops off. It’s early days yet for my system, but my hope for this year is that I’ll bank up enough credit over the spring, summer, and fall to be able to get through next winter (when the panels are covered with snow) by pulling from my credits. We’ll talk about how that’s working out so far in a bit, and my intention is to revisit that specific topic six months from now, and again six months after that.
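To make the banking mechanics concrete, here’s a toy ledger in Python. The monthly figures are invented for illustration, and the 12-month expiry of stale credits is left out for brevity.

```python
# Toy net-metering ledger: monthly surpluses bank as credit that rolls
# forward; shortfalls draw the bank down before anything is billed.
# (Monthly numbers are invented; the 12-month credit expiry is omitted.)

def run_ledger(monthly_net_kwh):
    bank = 0.0
    for month, net in enumerate(monthly_net_kwh, start=1):
        bank += net
        billed = 0.0
        if bank < 0:                      # credits exhausted
            billed, bank = -bank, 0.0
        print(f"month {month:2}: net {net:+6} kWh | "
              f"banked {bank:6.0f} | billed {billed:4.0f} kWh")

# Hypothetical year: spring/summer surpluses, snow-covered-winter deficits.
run_ledger([800, 1200, 1500, 1400, 900, 400,
            -200, -600, -900, -1100, -1200, -800])
```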

Getting Serious About the Purchase

After my research, I had a rough idea of what size system I needed to generate my annual power requirements. I also knew that, even though I’m not afraid of some DIY work, this was more project than I could handle without help. It was time to get some quotes and figure out if my plan was even remotely practical. I mentioned that there are lots of solar power calculators online, but there are also a number of services that will collect your details and share them with installers so they can produce bids for the job. I ended up using EnergySage for this step in the process and I highly recommend it. Answer a few questions, share a recent electric bill, and then sit back and wait.

Getting Quotes

After a week or so, I ended up with quotes from three different installers, all of them operating within about four hours of where I live. EnergySage’s portal summarized the quotes and made them easy to compare. It also audited them to a degree, playing the role of a third-party fact checker. The portal provided a means of communication with the installers that didn’t require me to share my direct contact info, which was welcome for the early stages of dialogue with them. All the quotes ended up in the same ballpark for cost and payback, but there were some differences. Each installer had their own preference for what brand of hardware to use and how much of it would be needed. The warranties differed too, both for the hardware and for the labor.

After engaging everyone in conversation for a couple weeks, I fleshed out the quotes by adding a critical load panel and requisite battery to them. I ultimately chose Strawberry Solar to do the job based on its combination of price, warranty, responsiveness, reviews, and its choice of hardware. I’ll say this upfront: Strawberry was and continues to be a pleasure to work with. If you’re interested in a solar project of your own, and they’re a candidate to do the job, I highly recommend the company.

The process that led to signing on the dotted line revealed one reason the quotes I got were so similar. It turned out that based on my electric usage, the system my household demanded was hitting the 20 kWp cap for residential solar power in Michigan. Even if that weren’t the case, any effort to go higher than 20 kWp would result in a “Category 2” program where I couldn’t get a credit for my power delivery rate, only for the supply rate. You can read details about that here, but it’s specific to my power company and could vary elsewhere. At any rate, the 20.13 kWp system that Strawberry landed on was right in the middle of my own estimate of 18-23 kWp, which was fine by me. The 20 kW solar system would be installed.

As things worked out, the system Strawberry installed ended up being 20.46 kWp. That number is reached by multiplying the 62 panels on my roof by their rated peak performance of 330 W each. The original quote was for 61 panels, but Strawberry tossed in a freebie to make the array look nicer. As it turns out, the installers were just as excited about setting up a big honkin’ system like mine as I was. I asked about the size of the system putting me over the limit for “Category 1” net metering, but I was assured that inefficiencies in switching DC to AC power meant that having slightly over 20 kWp in panels was not a problem. In fact, the two inverters in my basement only total up to 19 kW anyway.
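That DC-to-AC relationship is worth a quick sanity check. A slightly “over-paneled” array is normal, since panels rarely deliver their full rated output in practice; here’s the arithmetic:

```python
# Array vs. inverter sizing from the article; a DC/AC ratio a bit above
# 1.0 is common, since panels rarely hit their rated peak in the field.
array_kwp = 62 * 330 / 1000            # 62 panels at 330 W each
inverter_kw = 11.4 + 7.6               # SE11400 + SE7600

print(f"array:    {array_kwp:.2f} kWp")               # 20.46 kWp
print(f"inverter: {inverter_kw:.1f} kW")              # 19.0 kW
print(f"DC/AC ratio: {array_kwp / inverter_kw:.2f}")  # ~1.08
```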

The Hardware of a 20 kW Solar System

The Hardware Specifics (not too technical) 

Speaking of the inverters, let’s dive into the specifics a bit. Fair warning: this won’t be a technical deep dive. It’s just to whet your appetite with an idea of what one of these systems looks like, and what, uh, powers it. A lot of new stuff got added next to the existing electrical panel in my basement. There’s a new meter, a transformer, two inverters, a critical load panel, a box for breakers that allows both inverters to charge the battery, the battery itself, and a large raceway full of wires that connect everything together. We’ll take a quick look at each piece.

The meter is an unassuming box with an important job: it tracks incoming and outgoing power from the grid. It’s also responsible for making sure that power from the system doesn’t feed back to the grid when there is an outage, ensuring the safety of folks working to restore power. The transformer handles switching the DC power from the battery into AC power that can be used by anything connected to the critical load panel during a power outage.

The critical load panel doesn’t have any particular smarts; it’s just full of normal breakers. However, in the event of a power outage, the items hooked up to it will continue to run, either directly from solar power coming from the inverters or from the big battery, by way of the transformer. Regardless of how much solar energy is being produced, only the critical load panel will have power during an outage. We’ve got our sump pump, furnace, refrigerator, a circuit that powers Ellie’s room, and our cable modem and router hooked up to it. In the event of an outage, the basement will stay dry, the house will be warm, food’s chilled, Ellie’s humidified, and battery-powered devices online.

The battery itself is an LG Chem RESU, and it can store 9.8 kWh. Based on the loads it sees, that should be enough to last through most nights until the panels can take over and charge it up during the day. Hopefully, we don’t have to deal with many outages that last that long, though. I can’t stress enough how much peace of mind having the battery in the system gives me. Even if it was just for the sump pump alone, it would save me a ton of stress. In the worst-case scenario, it’s a huge buffer between when the power goes out and when I need to fight with the generator.

9.8 kWh ought to be enough for anybody.

We have the battery configured so that power is not typically drawn from it unless there is an outage. It could be configured in such a way as to minimize our power draw from the grid, but thanks to net metering, that isn’t a big concern for me. Instead, I’d rather make sure the battery is always topped off and ready to go in the event of an outage. The controller maintains the battery by automatically cycling it to optimize its lifespan.
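For a ballpark sense of why 9.8 kWh gets us through a night, here’s a quick back-of-the-napkin estimate. The load wattages below are rough guesses for illustration, not measured values from this installation.

```python
# Ballpark overnight runtime for the critical-load panel.
# The wattages are illustrative guesses, not measured figures.
battery_kwh = 9.8
loads_w = {
    "sump pump (duty-cycled)": 150,
    "furnace blower": 300,
    "refrigerator": 100,
    "modem + router": 20,
    "bedroom circuit": 80,
}
avg_kw = sum(loads_w.values()) / 1000
print(f"average draw: {avg_kw:.2f} kW")              # 0.65 kW
print(f"runtime: {battery_kwh / avg_kw:.0f} hours")  # ~15 hours
```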

More on The Hardware Specifics

When the system was first powered on, the battery was only 20% charged. I watched in astonishment as it drew 5,000 W directly from the solar panels to charge itself at the rate of about 1% a minute. It was nuts! So far, we’ve only had one small power outage to let us see what the battery could do. Everything worked exactly as planned, which ended up meaning the critical load panel ran directly from the solar panels and never touched the battery. Daytime outages: solved. I continue to be both nervous and excited to see what happens during a longer outage when it’s overcast and rainy the next day. Will the battery make it through the night? Will the panels be able to charge it the next day? I don’t want the power to go out, but I do want answers to those questions.

The cat room will never be the same.

Finally we get to the inverters, a SolarEdge SE11400 and a SolarEdge SE7600. They are 11.4 kW and 7.6 kW respectively, totaling up to the 19 kW I mentioned previously. They are attached directly to the Canadian Solar panels and SolarEdge power optimizers on the roof. By the way, mad props to the electricians that hooked all this gear up. It’s so organized, and I think it looks awesome. They also managed to fish the wires from the roof-mounted panels through the roof, down an interior wall, and into the basement without so much as a single hole in any of my drywall. Very impressive.

Length of Installation

The installation of all the hardware in my basement took about three eight-hour days for two electricians. That happened after four people spent two eight-hour days installing all of the panels on the roof. Since I was never up on the roof myself, I’m not going to go into great detail about the installation process, but I will say that I know they used hardware from IronRidge and that everything on the roof was drilled right into the trusses. The flashings that were used were tucked under the roof’s shingles and sealed with tar on the bottom. No worries about leaks here.

Did I mention that the installation was originally scheduled for the week of the great 2018 polar vortex? Yeah, we had to reschedule it for the following week, but it was still wet, bitter cold, and windy. The install team were beasts. Here are a few pics of the initial install.

I’ve also got a video from the deck on the other side of the house right after they finished putting the panels up.


There’s not that much snow, but the temp was barely in the double digits and wind was well into them.

Here’s some obligatory drone footage as well; more of that to come.


Don’t worry, I got a haircut shortly after seeing myself in this footage.

 

Final Costs, Specs, and Rewards

So what did all this cost? Let’s break down the numbers and compare them to my electric bill for a rough idea of what the ROI looks like. I’m not going to pussyfoot around—this is hard data, albeit with slightly-rounded numbers for simplicity. The complete system cost $65,000. I’ve been planning on this for a long time and was able to pay $15,000 down. We got a home equity loan for the remaining $50,000. It’s a 12-year loan with payments of $475 a month. My family-heavy electric bill averaged $350 a month last year, and in 2017 it averaged $275 a month. If you take the absolute worst-case scenario where electric costs don’t go up, I don’t pay off the loan early, and we use the 2017 bill for comparison, the system will pay for itself in 27 years. Oof.

Thankfully, that’s not how things are going to go down. For one, and this is a big deal, there’s currently a 30% tax credit from the feds that you can claim for solar projects. That works out to $19,500 in taxes that I don’t have to pay until it’s used up. By the way, and this is also a big deal, 2019 is the last year that tax credit is 30%. Next year it drops to 26%, and it only gets lower from there until it’s gone. You can only claim the credit if your system is installed and operational before the end of the year, so get cracking if you’re interested, because it will take at least a few months to get one in place.

With the tax credit as part of the equation, the EnergySage portal produced an estimated payback date of 8-10 years based on the data I fed it once they had my quotes. It gets speculative from here, because my electrical consumption is lower than what I advertised, and I plan to pay off my 12-year loan in closer to four years to avoid much of the interest. My best guess is that if my electrical consumption remains the same and electrical costs remain the same, I would get a full return on the investment in close to 14 years.
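Reduced to a few lines of Python, the optimistic floor looks like this. It assumes the bill drops to zero every month and ignores loan interest and future rate changes, so treat it as a sketch, not gospel.

```python
# Simple payback estimate from the article's figures; assumes the
# bill goes to zero and ignores loan interest and rate increases.
system_cost = 65_000
net_cost = system_cost * (1 - 0.30)       # after the 30% federal credit

for label, monthly_bill in [("2017-style usage", 275),
                            ("2018-style usage", 350)]:
    years = net_cost / (monthly_bill * 12)
    print(f"{label}: payback in about {years:.1f} years")
```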

However, I expect to use more power as time goes on (an electric car is up once the solar loan is paid off), and electrical costs will rise, so 10-12 years seems a more likely timeframe. Of course, the system is going to last another 20 years or so beyond that. Minus some maintenance costs, everything it saves after that is money in the bank. By the time the system reaches its end of life, it will not only have paid for itself, but it should also have saved me well over $100,000. In 2052, that could be enough money to get my inevitably decrepit 70-year-old body to Mars and turn it into fertilizer. I’m more than half-serious. I can hear my future-self pontificating now: “your ironically named Mr. Fusion may run off Bananorango® peels but my antique solar panels still charge my exoskeleton just fine, dagnabbit!”

Anyway, that’s the situation for me. A lot of things could be different for you, and I encourage you to look into your options. Much of the pitch for solar that you see is that you can straight-up trade your electric bill for a loan payment. That’s definitely not true in my case, but it certainly could be for others, or even for me if I had done things differently.

System performance

With all the exposition out of the way, we can finally get to how the system is performing. Though it’s only been producing power for a couple of months, I feel like I’ve seen it running in a pretty wide range of conditions. It handles some situations better than others, but overall it’s outperforming my expectations, especially in inclement weather. No, it doesn’t magically produce power when the panels are completely covered in snow, but in cloudy, rainy, or snowy weather it manages to generate a surprising amount of energy nonetheless. All the details regarding production are visible through the SolarEdge web portal and companion app. Check it out:


I spend an unhealthy amount of time staring at “my numbers.”

This is my data. Isn’t it lovely? I think the numbers speak for themselves, but to summarize, it shows me everything about how much power is being generated and consumed in near-real-time. I can see my various totals for the day, week, month, and year. It even shows me the state of my battery since one of the inverters is its boss. The data in this image is from 14:30, near peak time for production. You can see that the system is absolutely obliterating the household’s meager demand for power and is putting 85% of the power being generated back onto the grid. This will be a common theme.

Not bad for mid-March.

This screenshot is from a different day, but the story is largely the same. The partial graph above begins to tell the story of how net metering and my geographical location need to work together. Simply put, I’m not just looking at things on a day-to-day basis. In order for my system to completely eliminate my electric bill, I need it to cast a wide net and catch as much sun as possible whenever the opportunity presents itself. Generating two, three, or maybe four times as much power as I need in 24 hours is necessary to bank up enough credit to get through the months of the year where the panels are unlikely to see much sun at all thanks to the snow.

Before you ask, it’s generally frowned upon to clear snow off panels. They are hard to reach and there’s a non-negligible chance of damaging them, but it can be done if you’re so inclined. If they aren’t covered in snow, they actually perform better in the cold. Hooray for silicon!

The panels on the top are facing west, so I guess it must be past noon.

On the “layout” page of the portal, you can see the arrangement of the panels on the roof and get both real-time and historic performance data for each individual panel. It’s extremely cool, but it’s also a bit flaky. I’m still working with Strawberry and SolarEdge to make sure everything with this reporting tool is working correctly with my setup, because I sometimes see unexpected numbers here. That said, it’s clearly only a cosmetic problem, because the system is generating power correctly and the reported totals align with what my power company is seeing from its meter.


Just a little bit of power coming in where the snow has melted.

Information wants to be free

One of the coolest things about the SolarEdge portal—and the reason I didn’t cover it in more detail—is that you can make the data for your installation publicly available. That’s exactly what I’ve done, to the full extent that I can. Use this link to go directly to my own personal site, nicknamed Sunfish. You can also browse other public sites here. You won’t be able to see exactly the same information I can, but you can view real-time performance, the layout of the panels, and all the historical data any way you want to slice it.

That feature is nice and it was real easy to set up, but I wanted to give gerbils more than just graphs. I wanted them to see the weather that goes with them, so I settled on procuring a dedicated time-lapse camera with a weather-resistant housing and making sky-watch videos with a performance graph overlay. Here’s an example of what I ended up with.


A classic mid-April Michigan snow storm made for a cool set of back-to-back videos.

I’ve got a growing playlist here if you want to check out more. I thought it was important to show people exactly how the performance of solar power matches up with what’s going on in the sky. I intend to keep adding to this playlist in batches—the camera is outside 24/7, and I’ll pull the daily videos off it once a week or so. How long I’ll keep this running will depend on how interesting people find it, so let me know what you think and please share it if you find it worthwhile. I have no shortage of ideas for other uses for the camera, but I’ll keep it pointed at the sky if people are enjoying the view off our back deck. Personally, I’m excited to string a bunch of these videos together later in the year and produce a high-speed playback of the field growing up.

A Look at The Fully Installed System

Speaking of views, it’s about time I shared a proper look at the fully installed system. Here’s some more drone shots.

I had this project in mind when we bought this house close to four years ago. Even at the time, I knew that a large, south-facing roof would be ideal, but after checking out those photos, I’m pretty sure you’ll agree that the house was a pretty good candidate for roof-top solar regardless. There’s no shadow cast on the roof from anything on the ground, and since the panels face east and west, they collect sun from early in the morning until late into the evening. The best times for production change throughout the year, but right now in late April, the peak appears to be happening from 13:15 to 14:15, when the panels on both sides of the roof are collecting energy at the same rate.

More Videos for Perspective

Here are a couple more videos for additional perspective.


When I watch this, I still think the panels look like CGI.

Seriously, with the weather from that day, the panels just look freaky up there.

Again, you can really see how our house sticks out of the small peninsula of a private drive, right out into the middle of open fields with no trees or buildings blocking the sun. We’re not even close to the longest days of the year yet, and my panels are already generating more power than the house is using from about 08:00 to 20:00, if there aren’t too many clouds. I love it.

Speaking of the weather, I’ve never been more in-tune with it, nor have I ever appreciated sunshine more (or, you know, at all). Everyone knows about the weather rock, but my solar panels actually work similarly. I can look back through my SolarEdge data and tell a lot about how the weather must have looked that day. Increasingly, I’m able to see the weather forecast and predict how many kWh the system will likely produce.
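If you want to formalize that forecast-to-kWh intuition, one published rule of thumb is the Kasten-Czeplak (1980) cloud-cover relation. Treat this as a sketch: the 125 kWh clear-sky figure is borrowed from my best production day so far (mentioned below), and real output depends on season and sun angle too.

```python
# Rough production guess from a cloud-cover forecast, using the
# Kasten-Czeplak (1980) relation: G = G_clear * (1 - 0.75 * C**3.4).
# The 125 kWh clear-sky day is a stand-in, not a calibrated value.

def predicted_kwh(clear_sky_kwh, cloud_fraction):
    return clear_sky_kwh * (1 - 0.75 * cloud_fraction ** 3.4)

for clouds in (0.0, 0.5, 0.9, 1.0):
    print(f"{clouds:.0%} cloud cover -> "
          f"{predicted_kwh(125, clouds):5.1f} kWh")
```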

It’s a fun new aspect of conversation to have around the house as well as at the office, where some of my coworkers are nearly as obsessed with my numbers as I am. As a side note, my system inspired one of my coworkers to install 6.4 kW of panels on his own home. He’s taking a DIY approach and won’t have it finished for a few months, but his investment will probably be less than half the dollar-per-kW price tag of my professionally-installed system. Just something to keep in mind if you’re handy enough.

Final Thoughts on the 20 kW Solar System Installation

Over a month ago, we put up a poll asking folks what their household’s average daily kWh usage is. Even though the results of any online poll should be taken with a massive helping of molten salt, I feel confident that my vote of 61-75 kWh definitely places me in the upper echelons of power usage. Even if we only look at the close-to-40-kWh daily average that my house has used through March and April, I’m still above average for gerbils. The nice thing about solar projects like mine is that they scale pretty well, and as long as your roof has a clean view of the sky, the ROI is probably similar no matter where your usage falls. I suppose what I’m trying to say is that the concept is worth at least looking into, especially when the feds will give you a 30% tax credit.

This story isn’t over, not by a long shot. I’m only two months into a 30-year-long endeavor. My biggest takeaways so far are that in less than 60 days I’ve produced 60% more power than I’ve consumed, and I’ve exported a little over twice as much as I’ve imported. Funnily enough, I’ve become so cognizant of my power consumption that I’m using measurably less even though I have a significant surplus.

That all feels good, but I’ve yet to turn on the AC this year, and that will certainly have a dramatic impact on usage. The same lack of shade that makes my house good for solar also makes it a bear to keep cool. I hope that with longer days coming, I’ll be able to continue building up my electricity credit despite the need for AC, and get myself through the winter without paying for power. I didn’t have an electrical bill for March, and I won’t have one for April. I’d love for that to be true for every month going forward.

That’s one huge reason for doing this, after all. Well, that and zombie preparedness. I have no illusions of surviving a zombie apocalypse—rule #1 is a problem—but that doesn’t mean my house can’t be an electric oasis for someone else.

I’m less than half-serious about that one.

More in the Future!

I’m going to revisit this story at least twice more: once after the summer is over, and again after next winter. I’m looking forward to sharing stories of production records being broken (125 kWh is the number to beat so far), the LG battery backup saving the day, and filling in some of the other details I haven’t gotten to in this first brush with the topic. If you found what I’ve shared so far valuable or at least interesting, please share it with others and let me know what you think. I’ll check in with you again in October.

Related Post: Top 7 Outdoor Projectors to Check Out This Year

The post 20 kW solar system: Full Installation Walk-Through appeared first on The Tech Report.

]]>
https://techreport.com/review/dont-skimp-on-the-power-supply-a-20-kw-solar-installation-reviewed/feed/ 191
Innovations in Tech Going Above and Beyond https://techreport.com/gadget-digest/innovations-in-tech/ Sun, 22 May 2022 18:05:52 +0000 https://techreport.com/?p=3475344 Some innovations in gadgets have drastically changed people's lives and made them reliant on newer technology to improve their lives.

Some innovations in gadgets have drastically changed people’s lives and made them reliant on newer technology to improve their lives. Is it possible to be surprised by innovations in new...

The post Innovations in Tech Going Above and Beyond appeared first on The Tech Report.

]]>

Some innovations in gadgets have drastically changed people’s lives and made them reliant on newer technology to improve their lives.

Is it possible to be surprised by innovations in new gadgets? Today’s devices work on notions that appeared unthinkable decades ago. Every day, a new gadget delights users and introduces them to fresh discoveries. The number of amazing devices in every person’s life is astounding.

Today’s gadgets are not only cool but also offer fast solutions to common difficulties. Many of these gadgets balance practicality and coolness, making them more desirable. It’s hard to keep up with new technological developments or predict what’s next in the digital world. However, here’s a look at what’s coming up in the New Year.

Virtual Reality

Virtual reality (VR) has given techies a new set of tools to play with. VR headsets have been used in gaming for a while. However, VR is on its way to other industries like healthcare.

With VR headsets, surgeons can practice procedures and treat trauma patients. VR could also be a PTSD therapy tool. NASA has jumped on board this trend, too, and uses a VR simulator with a robotic crane to simulate gravity.

Robotic Vacuums Developments

Cleaning is a time-consuming task. Robotic vacuum cleaners help save time. They are also user-friendly and deliver superior floor cleaning. Some models even have a navigation system to help them know where to go.

Many of today’s robotic vacuum cleaners are pricey and out of reach for the average user, so tech companies are striving to make more affordable versions. In addition, versatility is a goal of upcoming robotic vacuum cleaners, with some models using laser navigation and others able to clean different flooring types.

Smart TVs doing more

Smart TVs have revolutionized entertainment, with most tech companies vying for leadership in smart TV innovations. Future smart TVs may become the hub of a smart household. OLED panels, popularized by smartphones, will soon be available on smart TVs. The combination of OLED and 8K makes the smart TV bigger, curvier, and smarter.

Transparent TV screens are also seeing developments. These TV screens are incredible. Though still in the early stages, OLED has been proven workable for building these transparent panels. In addition, the future of smart TVs includes data security.

Innovations in Air Purifiers

Cleaner air improves the quality of life. Air purifiers are ideal for cleaning the air in enclosed spaces. With the increasing amount of pollution, these devices are great for those with respiratory or health issues.

Air purifying curtains and portable air purifiers will be in demand in the future. Customers are ready for air purifiers that can track humidity, temperature, and air quality in real time, and the sensors in these devices are poised for significant efficiency gains.

Every day, manufacturers introduce new devices to make life healthier and easier. These devices and their innovations are revolutionizing entertainment, health, communication, and nearly every other industry in the world.

The post Innovations in Tech Going Above and Beyond appeared first on The Tech Report.

]]>
When’s the Playdate? A New Gaming Machine That Fits in Your Pocket https://techreport.com/gadget-digest/gaming-machine-playdate/ Fri, 20 May 2022 21:35:27 +0000 https://techreport.com/?p=3477460 In this article, we take a look at the upcoming release of the Playdate, a new low-cost handheld gaming machine that can fit in your pocket.

Gaming is now an integral part of our society. Since the days of Atari Pong, gaming has only grown steadily more popular. Nowadays, the average gaming machine is exceedingly complex...

The post When’s the Playdate? A New Gaming Machine That Fits in Your Pocket appeared first on The Tech Report.

]]>

Gaming is now an integral part of our society. Since the days of Atari Pong, gaming has only grown steadily more popular. Nowadays, the average gaming machine is exceedingly complex and innovative. Devices such as high-powered gaming PC systems, virtual reality headsets, and next-gen consoles are now in homes nearly everywhere.

However, many of us may fondly remember using handheld gaming devices in our childhood. That nostalgia has a powerful pull on the collective imagination.

Enter the Playdate, from Panic: a new handheld gaming machine that channels that nostalgia. Below, we’ll take a quick look at the new gaming machine that’s small enough to fit in your pocket.

Not “when” is the playdate…what is the Playdate?

The Playdate is a small, portable gaming device that can fit in your pocket. The device has a rectangular shape and features a bright yellow color.

The device features a classic black-and-white reflective display screen. The display is surprisingly gorgeous and helps to further the device’s nostalgic appeal. When the console is switched off, the screen functions as a clock, displaying the time.

Playdate devices come with their own unique games that can’t be found anywhere else.

The device features a directional pad and buttons as well as a small crank on the side that can unfold. The crank does not actually power or charge the device, but can instead provide functionality for certain games on the device. The Playdate also comes with WiFi functionality, Bluetooth, and even a small speaker attachment.

Original Games

As mentioned above, the Playdate comes with 24 original and unique games. However, these games don’t come with the console right away. Instead, users will receive two games each week for 12 weeks after setting up their device.

These games vary across styles and genres, but each is made specifically for the Playdate. With 24 different styles of gaming to try and experiment with, you are sure to find something you love.

Games range from titles such as DemonQuest to a chess simulator. Each game is uniquely developed and has its own unique controls. Some even allow users to utilize the crank on the device to control aspects of the game. For example, the game Crankin’s Time Travel Adventure allows the user to wind the crank in either direction to move forward and backward in time.

Panic continues to develop new games as well, promising new releases to come soon.

If you or someone you know is interested in designing a game, Panic makes it relatively easy to do so. While it is currently unavailable, Panic plans to release its own SDK, which will allow users to design their own games for free. Furthermore, if you are looking for a simpler option, the company will also release a point-and-click design tool for the web browser. Designing a video game will be easier than ever before.

Who is Panic?

The developer of the Playdate, Panic, is a US-based software company. Headquartered in Oregon, Panic software studios began in 1999.

They originally worked primarily with Mac OS software but have recently ventured into gaming. Untitled Goose Game, which Panic published, received massive praise from players and critics alike. For the release of the Playdate, Panic is partnering with Swedish electronics manufacturer Teenage Engineering.

Where can you buy the Playdate system? And how much does it cost?

Interested consumers can purchase the Playdate handheld gaming machine from the Playdate store. Users can put in a pre-order now at $179.

Coming with WiFi, Bluetooth, and other features, the Playdate is one of the cheapest gaming options available. With payment, users will get access to all 24 games, as well as access to whatever new games they release.

Customers can also pre-order a cover for their Playdate console for $29 which will help protect the device from damage.

Final Thoughts

With innovation at an all-time high, gadgets and devices are more complex than ever before. And most of the time we really enjoy having these new technological wonders.

However, sometimes it is nice to take a step backward. With the Playdate, users can bring back the nostalgic handheld gaming feeling of old. With a bright, colorful design, and a classic black and white screen display, the Playdate truly has a nostalgic look to it.

Combine that with modern features like WiFi and Bluetooth, and a completely original game library, and it is easy to get excited about the Playdate.

Nobody is expecting this device to directly compete with the likes of Microsoft, Sony, or Nintendo. However, at an affordable price of $179, the Playdate can offer a cool alternative for those who don’t want the complexity of some modern gaming systems. With a release sometime this year, we will be keeping our eye on the Playdate handheld gaming machine.

The post When’s the Playdate? A New Gaming Machine That Fits in Your Pocket appeared first on The Tech Report.

]]>
The Most Popular Wearable Medical Tech https://techreport.com/gadget-digest/popular-wearable-medical-tech/ Thu, 19 May 2022 21:04:10 +0000 https://techreport.com/?p=3477346 The Most Popular Wearable Medical Tech

Did you know that the smartphone in your pocket has more computing power than the computer used to put a man on the moon? Thanks to innovation and capitalism, our...

The post The Most Popular Wearable Medical Tech appeared first on The Tech Report.

]]>

Did you know that the smartphone in your pocket has more computing power than the computer used to put a man on the moon? Thanks to innovation and capitalism, our technology has improved rapidly in the last 50 years. In just that short time we have seen the creation of products such as cell phones, the internet, social media, and more. In just the past few years, wearable medical tech has even become commonplace.

Demand and consumption fuel new innovations and gadgets, and the latest trend in innovative technology is wearables. Millions of consumers worldwide now have access to wearable technologies and the many benefits they can bring. One of the major areas these wearables benefit is our health and fitness. This article will examine a few of the most popular examples of wearable medical tech.

Advantages of Wearable Medical Technology

Before we take a look at some of the products in the market, we should first talk about why wearable medical tech is gaining traction.

One of the primary benefits of these wearables is the ability to constantly monitor and analyze vitals. Many wearables will track data such as heart rate and blood flow. These devices can then use this information to quickly alert others and help in an emergency. They can also help guide you towards preventive treatment options.

Along with this constant monitoring also comes an element of peace of mind. By equipping your loved ones with wearable medical tech, you can ensure that they will never be left completely alone in an emergency.

Some wearables are even capable of alerting help in the event of a fall. This could be particularly helpful in the case of senior citizens living on their own. By equipping them with wearable technology, we can enhance their ability to find help if anything goes wrong.

Last but not least, wearable medical tech offers much greater ease of access compared to traditional healthcare.

One of the major issues many find with current healthcare is the high pricing and limited resources. Not only is it largely inconvenient to access these programs, but without insurance, it can be extremely expensive. Wearable medical tech offers a cost-effective alternative that works from the comfort of your home. The shift to wearable technology means that more people will be able to access quality health care at an affordable price.

All that said, let’s take a look at some of the best wearable medical technology currently on the market.

Kokoon Relax Headphones

The first wearable on our list seeks to help you maximize the health benefits of your rest.

At first, these appear to be a standard set of noise-canceling headphones. The headphones feature a patented fabric technology to ensure they stay cool overnight.

But, where the Kokoon Relax Headphones really capture our interest is in their ability to monitor your sleep. The wearable will track brain waves and your movement during the night to analyze your sleep. The headphones can then use this information to suggest guided meditations, soundscapes, and other personalized insights into your sleep.

FitBit Sense

The next entry on our list is one of the biggest names in wearable fitness technology.

It’s important to remember that a major component of your health is preventive action. Maintaining an active fitness plan is a great way to proactively care for your health, and the FitBit Sense is a great tool to enhance your fitness.

The FitBit will constantly monitor your blood flow and heart rate, helping to track your workout. The wearable can then suggest or recommend fitness plans or guided workouts to match your needs.

Along with heart rate and blood flow, the FitBit Sense can also monitor ECG waves, stress, and temperature. Using all this data, the device then provides a daily readiness score, which lets you know whether to prioritize exercise or recovery on a given day. The FitBit Sense is one of the most popular pieces of wearable medical technology today.

Wearable BioSensors

While these biosensors are relatively undeveloped compared to other entries on this list, it is likely they will soon be commonplace.

These biosensors will be able to take on a variety of different forms and allow for continuous communication between doctor and patient. The biosensors can track heart rate, blood pressure, ECG waves, and even alert someone in case of a fall.

The potential for these biosensors is nearly limitless and it’s exciting to watch their development.

Wearable ECG and Blood Pressure Monitors

The final entry on our list covers wearable ECG and blood pressure monitors.

These are produced by a variety of health and wellness companies and all are slightly different. However, they all serve the same core function of providing constant monitoring and feedback on your vitals.

These monitors often take the form of a small pack or device that can be worn underneath your clothing. They don’t offer the same versatility as biosensors will, but they provide a valuable function nevertheless.

While these are great examples of wearable medical tech, there are many other options available. We encourage you to do your own research and select an option that works best for you.

The post The Most Popular Wearable Medical Tech appeared first on The Tech Report.

]]>
GC Plus: The Rising ‘Uber’ in the Home Services Industry https://techreport.com/innovation/gc-plus-home-services-industry/ Thu, 19 May 2022 17:45:11 +0000 https://techreport.com/?p=3477644 GC Plus aims to revolutionize home improvement services much the same way Uber did for transportation, but this time for plumbing repair.

By now, nearly everyone is aware of Uber’s enormous impact on the taxi industry. After only a few years of operation, the popular ride-sharing service undoubtedly transformed the sector. The...

The post GC Plus: The Rising ‘Uber’ in the Home Services Industry appeared first on The Tech Report.

]]>

By now, nearly everyone is aware of Uber’s enormous impact on the taxi industry. After only a few years of operation, the popular ride-sharing service undoubtedly transformed the sector. It offered lower prices and faster, higher-quality service. Uber also provided a greater degree of transparency in selecting drivers and establishing fees. All this happened before the journey even began! Currently, GC Plus aims to achieve the same thing in many ways, but this time in the home improvement and plumbing repair industries.

So, how will this work? What effects will it have on these industries? Let’s take a quick look.

What exactly is GC Plus?

GC Plus initially started as a traditional plumbing business in Chicago, Illinois. Before long, the team discovered fundamental problems within the industry. Common issues included vague pricing and people in remote locations being unable to access services at all. There was also an overall lack of transparency and customer service throughout the process.

Meanwhile, handymen and plumbers were struggling with issues of their own. Independent contractors faced little growth potential, uncertain financial stability, and a lack of support. They were struggling with business-level activities such as marketing and accounting. Even those who worked with franchises struggled with long-term contracts and high start-up costs. These rarely ended up being worth the investment.

With this in mind, GC Plus developed an innovative product. It builds upon the sharing economy using an on-demand service similar to Uber.

Customers are matched with the right technician by answering questions about the service needed. Recommendations are (as you would expect) dependent on their home location. GC Plus utilizes an AI-powered system that processes the client’s answers and location. It then sends requests only to technicians who are qualified to do the job. They must also be able to do it at the customer’s desired time and location. After that, whoever accepts the job first will get the assignment.
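For the technically curious, the flow described above boils down to a filter step followed by a first-to-accept race. Here is a hypothetical sketch of that pattern in Python; all names and fields are invented for illustration and are not GC Plus’s actual system or API.

```python
# Hypothetical sketch of a filter-then-first-to-accept dispatch flow.
# Everything here is invented for illustration.
from dataclasses import dataclass

@dataclass
class Technician:
    name: str
    skills: set
    zips: set          # service area as ZIP codes
    available: bool

def eligible(techs, skill, zip_code):
    """Only qualified, available, in-area technicians see the request."""
    return [t for t in techs
            if skill in t.skills and zip_code in t.zips and t.available]

def dispatch(candidates, accepts):
    """The first technician to accept gets the job."""
    return next((t for t in candidates if accepts(t)), None)

techs = [Technician("Ann", {"plumbing"}, {"60601"}, True),
         Technician("Bo", {"hvac"}, {"60601"}, True)]
job = eligible(techs, "plumbing", "60601")      # -> [Ann]
winner = dispatch(job, accepts=lambda t: True)  # Ann accepts first
```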

Below is a list of some of the main features and benefits of using the GC Plus service.

1. Easily connect with handymen in your area.

It can be tough to get a handyman to come out to your property. This is especially the case if you live in a remote area. Similarly, it’s hard to find help when you have an emergency in the early hours of the morning.

Fortunately, rather than scouring Google for local plumbing professionals, GC Plus makes it simple to connect with plumbers. It doesn’t matter where you are or what the problem is. All you have to do is upload the details of your request to the network. After that, you will be connected with professionals willing to attend to the job.

2. Know the price from the get-go.

Most people can attest to calling out a plumber without really understanding how much the job will cost.

With GC Plus, this sort of financial nightmare becomes a thing of the past. The AI estimating tool auto-generates all prices. It sends out a quote after the customer answers all the required questions and uploads all necessary information.

3. Set up a video call with repair professionals.

GC Plus offers a feature that enables you to connect with professionals via video chat. It doesn’t matter whether it’s for safety and emergency repairs or general upkeep and maintenance.

This is ideal for smaller jobs you could do yourself but are unsure how to perform properly. Through the GC Plus service, customers can save time and money. You may not need to call out a professional at all.

The whole process provides peace of mind. You can be sure that the contractor (or you) will do the job correctly.

What’s next for GC Plus?

GC Plus is planning to gradually expand. They hope to offer other home services in addition to plumbing. The company hopes to add skilled handymen, HVAC, flooring, painting, roofing, landscaping, and other home services to the mix.

Final Thoughts

From taxi services to home food delivery, the on-demand economy has already revolutionized a slew of industries.

In the months and years ahead, GC Plus hopes to do likewise. They seek nothing less than to have the same effect in the plumbing and home improvement sectors. They are looking primarily to provide greater value to clients. The hope is that increased transparency will simultaneously enhance the lives of plumbing professionals.

Much of its success will be determined by how well both customers and professionals adopt the service. However, the value it provides to both parties is clear. Looking forward, there is no reason why it can’t go on to revolutionize these industries.

The post GC Plus: The Rising ‘Uber’ in the Home Services Industry appeared first on The Tech Report.

]]>
Buying a Smart TV on Amazon: Best Options for May 2022 https://techreport.com/gadget-digest/smart-tv-on-amazon/ Tue, 17 May 2022 14:40:40 +0000 https://techreport.com/?p=3477429 Shopping for a smart TV can be difficult, so we've rounded up the seven best television options on Amazon right now to make the search easier.

With so many newer technology options available today, something as seemingly simple as buying a new smart TV can feel like a stressful, daunting task. From screen size and smart...

The post Buying a Smart TV on Amazon: Best Options for May 2022 appeared first on The Tech Report.

]]>

With so many newer technology options available today, something as seemingly simple as buying a new smart TV can feel like a stressful, daunting task. From screen size and smart features to resolution and operating systems, there is so much to parse through before making a decision.

Never fear! We’ve rounded up the seven best TV options on Amazon right now. Hopefully, it will make the search for your next TV easier. (All links and prices are accurate as of the time of publication.)

Philips 4K Android TV

One of the most trusted brands on the market, the Philips 4K Android TV is a fantastic option for those looking for an Android TV with 4K high-definition resolution.

Special features include Bluetooth, Google Assistant, Google Duo (which allows for video calls), and Chromecast (to share your phone or tablet screen to your TV as needed).

This TV also only weighs 20 pounds, which is five pounds lighter than the average 50-inch television, making it a great choice for those who wish to mount their TVs to a wall.

  • This television is available for $388, with the option to add Expert Wall Mounting services for an additional charge.

LG OLED C1 Series Alexa Built-in 4k Smart TV

LG is a strong brand whose smart TVs have a range of sought-after features available. The OLED C1 Series is great for those looking for a TV with Alexa and Google Assistant capabilities.

This TV is great for gamers: its HDMI ports support faster gaming speeds, and its low-latency mode options help avoid lag.

For those who want a movie theater experience, there is a Filmmaker Mode option that shows the movie in Cinema HDR and with Dolby Vision IQ and Atmos.

  • This TV is available in 48, 55, 65, 77, and 83 inches and starts at $996.99. You can also add a soundbar to your purchase for an additional charge.

SAMSUNG Class Crystal UHD AU8000 Series

One of the most common television brands, Samsung, is a quality and reliable TV option.

The Class Crystal UHD AU8000 Series has an extremely thin profile and boasts a crystal layer, which makes the colors more powerful. It also comes with a 4K crystal processor that has quick optimization of all 4K content.

Equipped with three HDMI ports for connecting multiple devices, plus multiple voice assistant options, this TV blends seamlessly into the wall.

  • This TV is available in 43, 50, 55, 65, 75, and 85 inches and starts at $499.99. You can also add a soundbar to your purchase for an additional charge.

Amazon Fire TV Omni Series 4K UHD Smart TV

The Amazon Fire TV has 4K Ultra HD, HDR10, hybrid log-gamma, and hands-free access to Alexa. One major plus for this TV is that it supports Dolby Vision, which creates a more robust cinematic experience for viewers.

For those concerned with additional privacy, there are built-in privacy protections that allow you to disconnect the microphones as needed.

  • This television is available in 43, 50, 55, 65, and 75 inches and starts at $409.99. You can also add a wall mount, remote cover, and Expert Wall Mounting services to your purchase for an additional charge.

VIZIO V-Series 4K UHD LED HDR Smart TV

The VIZIO V-Series TV is another great option for gamers. Activate Game Mode for lower latency and a better gaming experience. This TV has Chromecast and Apple AirPlay for flexible watching and communication.

The V-series also works with Alexa, meaning you can pair it with an Alexa device to use voice control.

  • This television is available in 43, 50, 55, 58, 65, 70, and 75 inches and starts at $288. You can also add Expert Wall Mounting services to your purchase for an additional charge.

LG NanoCell 75 Series 4k Smart TV

The LG NanoCell gets its name from its 4K nanocell display, which creates vivid, lifelike colors.

Along with the crisp picture, there are Active HDR and Filmmaker Mode options to experience movies the way film directors want audiences to experience their work.

Nanocell tech also creates a stronger gaming experience and utilizes auto low-latency mode. Because this TV has a built-in Alexa, you can run your smart home using voice commands.

  • This television is available in 43, 50, 55, 65, 75, and 86 inches and starts at $366.99.

Insignia Class F20 Series Smart Fire TV

A smaller option than the TVs listed above, the Insignia Class F20 Series Smart HD 720p Fire TV is great for smaller spaces.

The voice remote uses Alexa, allowing the user to open apps, search, use smart devices, and other programmed voice-activated commands.

  • This television is available in 24, 29, and 32 inches and starts at $99.99 in HD or HD DTS TruSurround. You can also add Expert Wall Mounting services to your purchase for an additional charge.

Buying a New Smart TV: Tailor Your Choice to Your Needs

Shopping for a new television with so many options on the market today can be tough.

Once you’ve decided on which features matter to you most, you can use this list to make a final decision. And if you happen to be an Amazon Prime customer, you could probably have a new TV delivered to your door in two days.

The post Buying a Smart TV on Amazon: Best Options for May 2022 appeared first on The Tech Report.

]]>
AMD Radeon R9 Fury X Graphics Card Reviewed https://techreport.com/review/amds-radeon-r9-fury-x-graphics-card-reviewed/ Mon, 16 May 2022 18:00:00 +0000 http://localhost/wordpress/amds-radeon-r9-fury-x-graphics-card-reviewed Introduction This article is a review of the AMD Radeon R9 Fury X. It includes information, pictures, tables, data, game testing, and more. The Review of the AMD Radeon R9 Fury...

The post AMD Radeon R9 Fury X Graphics Card Reviewed appeared first on The Tech Report.

]]>
Introduction

This article is a review of the AMD Radeon R9 Fury X. It includes information, pictures, tables, data, game testing, and more.

The Review of the AMD Radeon R9 Fury X

The Fury X is here. At long last, after lots of hype, we can show you how AMD’s new high-end GPU performs aboard the firm’s snazzy new liquid-cooled graphics card. We’ve tested in a range of games using our famous frame-time-based metrics, and we have a full set of results to share with you. Let’s get to it.

A brief stop in Fiji

Over the past several weeks, almost everything most folks would want to know about the new Radeon GPU has become public knowledge—except for how it performs. If you’ve somehow missed out on this info, let’s take a moment to summarize. At the heart of the Radeon R9 Fury X are two new core technologies: the Fiji graphics processor and a new type of memory known as High Bandwidth Memory (HBM).

The Fiji GPU is AMD’s first new top-end GPU in nearly two years, and it’s the largest chip in a family of products based on the GCN architecture that stretches back to 2011. Even the Xbox One and PS4 are based on GCN, although Fiji is an evolved version of that technology built on a whole heck of a lot larger scale. Here are its vitals compared to the biggest PC GPUs, including the Hawaii chip from the Radeon R9 290X and the GM200 from the GeForce GTX 980 Ti.

| GPU    | ROP pixels/clock | Texels filtered/clock (int/fp16) | Shader processors | Rasterized triangles/clock | Memory interface width (bits) | Est. transistors (millions) | Die size (mm²) | Fab process |
|--------|------------------|----------------------------------|-------------------|----------------------------|-------------------------------|-----------------------------|----------------|-------------|
| GM200  | 96               | 192/192                          | 3072              | 6                          | 384                           | 8,000                       | 601            | 28 nm       |
| Hawaii | 64               | 176/88                           | 2816              | 4                          | 512                           | 6,200                       | 438            | 28 nm       |
| Fiji   | 64               | 256/128                          | 4096              | 4                          | 4096                          | 8,900                       | 596            | 28 nm       |

Yep, Fiji’s shader array has a massive 4096 ALU lanes or “shader processors,” more than any other GPU to date. To give you some context for these numbers, once you factor in clock speeds, the Radeon R9 Fury X has seven times the shader processing power of the Xbox One and over seven times the memory bandwidth. Even a block diagram of Fiji looks daunting.


A simplified block diagram of a Fiji GPU. Source: AMD.
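To put some arithmetic behind that shader-power comparison: peak single-precision throughput is just ALU lanes times two operations per clock (a fused multiply-add) times clock speed. A quick sketch, using the widely reported Xbox One figures of 768 shaders at 853 MHz:

```python
# Peak single-precision throughput: ALU lanes x 2 ops/clock (FMA) x clock.
def tflops(shaders, mhz):
    return shaders * 2 * mhz / 1e6      # MFLOPS -> TFLOPS

fury_x   = tflops(4096, 1050)   # ~8.6 TFLOPS
xbox_one = tflops(768, 853)     # ~1.3 TFLOPS (widely reported figures)
print(f"Fury X: {fury_x:.1f} TFLOPS, {fury_x / xbox_one:.1f}x Xbox One")
```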

In many respects, Fiji is just what you see above: a larger implementation of the same GCN architecture that we’ve known for several years. AMD has made some important improvements under the covers, though. Notably, Fiji inherits a delta-based color compression facility from last year’s Tonga chip. This feature should allow the GPU to use its memory bandwidth and capacity more efficiently than older GPUs like Hawaii. Many of the other changes in Fiji are meant to reduce power consumption. A feature called voltage-adaptive operation, first used in AMD’s Kaveri and Carrizo APUs, should allow the chip to run at lower voltages, reducing power draw. New methods for selecting voltage and clock speed combinations and switching between those different modes should make Fiji more efficient than older GCN graphics chips, too. (For more info on Fiji’s graphics architecture, be sure to read my separate article on the subject.)

This combination of increased scale and reduced power consumption allows Fiji to cram about 45% more processing power into roughly the same power envelope as Hawaii before it. Yet even that fact isn’t Fiji’s most notable attribute. Instead, Fiji’s signature innovation is HBM, the first new type of high-end graphics memory introduced in seven years. HBM takes advantage of a technique in chip-building technology known as stacking, in which multiple silicon dies are piled on top of one another in order to improve the bit density. We’ve seen stacking deployed in the flash memory used in SSDs, but HBM is perhaps even more ambitious. And Fiji is the first commercial implementation of this tech.


A simplified illustration of an HBM solution. Source: AMD.

The Fiji GPU sits atop a piece of silicon, known as an interposer, along with four stacks of HBM memory. The individual memory chips run at a relatively sedate speed of 500 MHz in order to save power, but each stack has an extra-wide 1024-bit connection to the outside world in order to provide lots of bandwidth. This “wide and slow” setup works because the GPU and memory get to talk to one another over the silicon interposer, which is the next best thing to having memory integrated on-chip.

With four HBM stacks, Fiji effectively has a 4096-bit-wide path to memory. That memory transfers data at a rate of 1 Gbps per pin, yielding a heart-stopping total of 512 GB/s of bandwidth. The Fury X's closest competitor, the GeForce GTX 980 Ti, tops out at 336 GB/s, so the new Radeon represents a substantial advance.
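For the curious, the back-of-the-envelope math is simple. Here's a quick sketch in Python, using only the figures quoted above:

```python
# HBM bandwidth math for Fiji, using the figures quoted above.
stacks = 4                  # HBM stacks on the interposer
bits_per_stack = 1024       # each stack has a 1024-bit interface
gbps_per_pin = 1.0          # 500MHz, double data rate -> 1 Gbps per pin

bus_width = stacks * bits_per_stack            # 4096 bits
bandwidth_gbs = bus_width * gbps_per_pin / 8   # bits -> bytes

print(f"{bus_width}-bit bus at {gbps_per_pin} Gbps/pin = {bandwidth_gbs:.0f} GB/s")
# 4096-bit bus at 1.0 Gbps/pin = 512 GB/s
```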

HBM also saves power, both on the DRAM side and in the GPU’s memory control logic, and it enables an entire GPU-plus-memory solution to fit into a much smaller physical footprint. Fiji with HBM requires about one-third the space of Hawaii and its GDDR5, as illustrated above.

This very first implementation of HBM does come with one potential drawback: it’s limited to a total of 4GB of memory capacity. Today’s high-end cards, including the R9 Fury X, are marketed heavily for use with 4K displays. That 4GB capacity limit could perhaps lead to performance issues, especially at very high resolutions. AMD doesn’t seem to think it will be a problem, though, and, well, you’ll see our first round of performance numbers shortly.

The Radeon R9 Fury X card

Frankly, I think most discussions of the physical aspects of a graphics card are horribly boring compared to the GPU architecture stuff. I’ll make an exception for the Fury X, though, because this card truly is different from the usual fare in some pretty dramatic ways.

Card | GPU boost clock | Shader processors | Memory config | PCIe aux power | Peak power draw | E-tail price
Radeon R9 Fury X | 1050 MHz | 4096 | 4 GB HBM | 2 x 8-pin | 275W | $649.99

The differences start with the card itself, which is a stubby 7.7″ long and has an umbilical cord running out of its belly toward an external water cooler. You can expect this distinctive layout from all Fury X cards, because AMD has imposed tight controls for this product. Board makers won’t be free to tweak clock speeds or to supply custom cooling for the Fury X.

Instead, custom cards will be the domain of the vanilla Radeon R9 Fury, due in mid-July at prices starting around $550. The R9 Fury’s GPU resources will be trimmed somewhat compared to the Fury X, and customized boards and cooling will be the norm for it. AMD tells us to expect some liquid-cooled versions of the Fury and others with conventional air coolers.

Few of those cards are likely to outshine the Fury X, though, because video card components don’t get much more premium than these. The cooling shroud’s frame is encased in nickel-plated chrome, and the black surfaces are aluminum plates with a soft-touch coating. The largest of these plates, covering the top of the card in the picture above, can be removed with the extraction of four small hex screws. AMD hopes end-users will experiment with creating custom tops via 3D printing.

I’m now wondering if that liquid cooler could also keep a beer chilled if I printed a cup-holder attachment. Hmm.

The Fury X’s array of outputs is relatively spartan, with three DisplayPort 1.2 outputs and only a single HDMI 1.4 port. HDMI 2.0 support is absent, which means the Fury X won’t be able to drive most cheap 4K TVs at 60Hz. You’re stuck with DisplayPort if you want to do proper 4K gaming. Also missing, though perhaps less notable, is a DVI port. That omission may sting a little for folks who own big DVI displays, but DisplayPort-to-DVI adapters are pretty widely available. AMD is sending a message with this choice of outputs: the Fury X is about gaming in 4K, with FreeSync at high refresh rates, and on multiple monitors. In fact, this card can drive as many as six displays with the help of a DisplayPort hub.

Here’s a look beneath the shroud. The Fury X’s liquid cooler is made by Cooler Master, as the logo atop the water block proclaims. This block sits above the GPU and the HBM stacks, pulling heat from all of the chips.

AMD’s decision to make liquid cooling the stock solution on the Fury X is intriguing. According to Graphics CTO Raja Koduri, the firm found that consumers want liquid cooling, as evidenced by the fact that they often wind up paying extra for aftermarket kits. This cooler does seem like a nice inclusion, something that enhances the Fury X’s value, provided that the end user has an open emplacement in his or her case for a 120-mm fan and radiator. Sadly, I don’t think the new Damagebox has room for another radiator, since I already have one installed for the CPU.

The cooler in the Fury X is tuned to keep the GPU at a frosty 52°C, well below the 80-90°C range we’re used to seeing from stock coolers. The card is still very quiet in active use despite the aggressive temperature tuning, probably because the cooler is rated to remove up to 500W of heat. Those chilly temps aren’t just for fun, though. At this lower operating temperature, the Fiji GPU’s transistors shouldn’t be as leaky. The chip should convert less power into heat, thus improving the card’s overall efficiency. The liquid cooler probably also helps alleviate power density issues, which may have been the cause of the R9 290X’s teething problems with AMD’s reference air coolers.

That beefy cooler should help with overclocking, of course, and the Fury X's power delivery circuitry has plenty of built-in headroom, too. The card's six-phase power can supply up to 400 amps, well above the 200-250 amps that the firm says is needed for regular operation. The hard limit in the BIOS for GPU power is 300W, which adds up to 375W of total board power draw. That's 100W beyond the Fury X's default limit of 275W.

Facilitating Overclocking

To better facilitate overclocking, the Catalyst Control Center now exposes separate sliders for the GPU’s clock speed, power limit, temperature, and maximum fan speed. Users can direct AMD’s PowerTune algorithm to seek the mix of acoustics and performance they prefer.

Despite its many virtues, our Fury X review unit does have one rather obvious drawback. Whenever it’s powered on, whether busy or idle, the card emits a constant, high-pitched whine. It’s not the usual burble of pump noise, the whoosh of a fan, or the irregular chatter of coil whine—just an unceasing squeal like an old CRT display might emit. The noise isn’t loud enough to register on our sound level meter, but it is easy enough to hear. The sound comes from the card proper, not from the radiator or fan. An informal survey of other reviewers suggests our card may not be alone in emitting this noise. I asked AMD about this matter, and they issued this statement:

AMD received feedback that during open bench testing some cards emit a mild “whining” noise. This is normal for most high speed liquid cooling pumps; Usually the end user cannot hear the noise as the pumps are installed in the chassis, and the radiator fan is louder than the pump. Since the AMD Radeon R9 FuryX radiator fan is near silent, this pump noise is more noticeable.

The issue is limited to a very small batch of initial production samples and we have worked with the manufacturer to improve the acoustic profile of the pump. This problem has been resolved and a fix added to production parts and is not an issue.

That’s reassuring—I think. I’ve asked AMD to send us a production sample so we can verify that retail units don’t generate this noise.

Fury X cards have one more bit of bling that's not apparent in the pictures above: das blinkenlights. Specifically, the Radeon logo atop the cooler glows deep red. (The picture above lies. It's stoplight red, honest.) Also, a row of LEDs next to the power plugs serves as a GPU tachometer, indicating how busy the GPU happens to be.

These lights are red by default, but they can be adjusted via a pair of teeny-tiny DIP switches on the back of the card. The options are: red tach lights, blue tach lights, red and blue tach lights, and tach lights disabled. There’s also a green LED that indicates when the card has dropped into ZeroCore power mode, the power-saving mode activated when the display goes to sleep.

Speaking of going to sleep, that’s what I’m gonna do if we don’t move on to the performance results. Let’s do it.

Game Testing

Most of the numbers you’ll see on the following pages were captured with Fraps, a software tool that can record the rendering time for each frame of animation. We sometimes use a tool called FCAT to capture exactly when each frame was delivered to the display, but that’s usually not necessary in order to get good data with single-GPU setups. We have, however, filtered our Fraps results using a three-frame moving average. This filter should account for the effect of the three-frame submission queue in Direct3D. If you see a frame time spike in our results, it’s likely a delay that would affect when the frame reaches the display.
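For readers who want to reproduce that step, here's a minimal sketch of a trailing three-frame moving average in Python. Our actual tooling differs; this is purely illustrative:

```python
def moving_average(frame_times_ms, window=3):
    """Smooth raw Fraps frame times with a trailing moving average.

    This approximates the effect of Direct3D's three-frame submission
    queue: a lone long frame gets spread across its neighbors, so only
    delays likely to reach the display survive as spikes.
    """
    filtered = []
    for i in range(len(frame_times_ms)):
        chunk = frame_times_ms[max(0, i - window + 1):i + 1]
        filtered.append(sum(chunk) / len(chunk))
    return filtered

# A lone 45-ms hitch is damped once averaged with its 16.7-ms neighbors.
print(moving_average([16.7, 16.7, 45.0, 16.7, 16.7]))
```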

We didn’t use Fraps with Civ: Beyond Earth or Battlefield 4. Instead, we captured frame times directly from the game engines using the games’ built-in tools. We didn’t use our low-pass filter on those results.

As ever, we did our best to deliver clean benchmark numbers. Our test systems were configured like so:

Processor: Core i7-5960X
Motherboard: Gigabyte X99-UD5 WiFi
Chipset: Intel X99
Memory size: 16GB (4 DIMMs)
Memory type: Corsair Vengeance LPX DDR4 SDRAM at 2133 MT/s
Memory timings: 15-15-15-36 2T
Chipset drivers: INF update 10.0.20.0, Rapid Storage Technology Enterprise 13.1.0.1058
Audio: Integrated X99/ALC898 with Realtek 6.0.1.7246 drivers
Hard drive: Kingston SSDNow 310 960GB SATA
Power supply: Corsair AX850
OS: Windows 8.1 Pro
Card | Driver revision | GPU base clock (MHz) | GPU boost clock (MHz) | Memory clock (MHz) | Memory size (MB)
Asus Radeon R9 290X | Catalyst 15.4/15.5 betas | n/a | 1050 | 1350 | 4096
Radeon R9 295 X2 | Catalyst 15.4/15.5 betas | n/a | 1018 | 1250 | 8192
Radeon R9 Fury X | Catalyst 15.15 beta | n/a | 1050 | 500 | 4096
GeForce GTX 780 Ti | GeForce 352.90 | 876 | 928 | 1750 | 3072
Gigabyte GeForce GTX 980 | GeForce 352.90 | 1228 | 1329 | 1753 | 4096
GeForce GTX 980 Ti | GeForce 352.90 | 1002 | 1076 | 1753 | 6144
GeForce Titan X | GeForce 352.90 | 1002 | 1076 | 1753 | 12288

Thanks to Intel, Corsair, Kingston, and Gigabyte for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Also, our FCAT video capture and analysis rig has some pretty demanding storage requirements. For it, Corsair has provided four 256GB Neutron SSDs, which we’ve assembled into a RAID 0 array for our primary capture storage device. When that array fills up, we copy the captured videos to our RAID 1 array, comprised of a pair of 4TB Black hard drives provided by WD.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Sizing ’em up

Do the math involving the clock speeds and per-clock potency of the latest high-end graphics cards, and you’ll end up with a comparative table that looks something like this:

Card | Peak pixel fill rate (Gpixels/s) | Peak bilinear filtering int8/fp16 (Gtexels/s) | Peak rasterization rate (Gtris/s) | Peak shader arithmetic (tflops) | Memory bandwidth (GB/s)
Asus R9 290X | 67 | 185/92 | 4.2 | 5.9 | 346
Radeon R9 Fury X | 67 | 269/134 | 4.2 | 8.6 | 512
GeForce GTX 780 Ti | 37 | 223/223 | 4.6 | 5.3 | 336
Gigabyte GTX 980 | 85 | 170/170 | 5.3 | 5.4 | 224
GeForce GTX 980 Ti | 95 | 189/189 | 6.5 | 6.1 | 336
GeForce Titan X | 103 | 206/206 | 6.5 | 6.6 | 336

Those are the peak capabilities of each of these cards in theory. As I noted in my article on the Fiji GPU architecture, the Fury X is particularly strong in several departments, including memory bandwidth and shader rates, where it substantially outstrips both the R9 290X and the competing GeForce GTX 980 Ti. In other areas, the Fury X’s theoretical graphics rates haven’t budged compared to the 290X, including the pixel fill rate and rasterization. Those are also precisely the areas where the Fury X looks weakest compared to the competition. We are looking at a bit of asymmetrical warfare this time around, with AMD and Nvidia fielding vastly different mixes of GPU resources in similarly priced products.
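If you'd like to check our arithmetic, each figure in the table is just a unit count multiplied by the clock speed. A sketch for the Fury X, using the specs from the previous pages:

```python
# Theoretical peak rates for the Radeon R9 Fury X.
boost_ghz = 1.05       # 1050MHz boost clock
rops = 64              # pixels per clock
texels_int8 = 256      # int8 texels filtered per clock
shader_alus = 4096
tris_per_clock = 4

pixel_fill    = rops * boost_ghz            # 67 Gpixels/s
filtering     = texels_int8 * boost_ghz     # 269 Gtexels/s
rasterization = tris_per_clock * boost_ghz  # 4.2 Gtris/s
tflops        = shader_alus * 2 * boost_ghz / 1000  # 2 flops/ALU/clock (FMA) -> 8.6
bandwidth     = 4096 * 1.0 / 8              # 4096-bit bus at 1 Gbps/pin -> 512 GB/s
```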

Of course, those are just theoretical peak rates. Our fancy Beyond3D GPU architecture suite measures true delivered performance using a series of directed tests.

The Fiji GPU has the same 64 pixels per clock of ROP throughput as Hawaii before it, so these results shouldn’t come as a surprise. These numbers illustrate something noteworthy, though. Nvidia has grown the ROP counts substantially in its Maxwell-based GPUs, taking even the mid-range GM204 aboard the GTX 980 beyond what Hawaii and Fiji offer. Truth be told, both of the Radeons probably offer more than enough raw pixel fill rate. However, these results are a sort of proxy for other types of ROP power, like blending for multisampled anti-aliasing and Z/stencil work for shadowing, that can tax a GPU.

This bandwidth test measures GPU throughput using two different textures: an all-black surface that’s easily compressed and a random-colored texture that’s essentially incompressible. The Fury X’s results demonstrate several things of note.

The 16% delta between the black and random textures shows us that Fiji’s delta-based color compression does it some good, although evidently not as much good as the color compression does on the Maxwell-based GeForces.

Also, our understanding from past reviews was that the R9 290X was limited by ROP throughput in this test. Somehow, the Fury X speeds past the 290X despite having the same ROP count on paper. Hmm. Perhaps we were wrong about what limited the 290X. If so, then the 290X may have been bandwidth limited, after all—and Hawaii apparently has no color compression of note. The question then becomes whether the Fury X is also bandwidth limited in this test, or if its performance is limited by the render back-end. Whatever the case, the Fury X "only" achieves 387 GB/s of throughput here, well below the 512 GB/s theoretical max of its HBM-infused memory subsystem. Ominously, the Fury X only leads the GTX 980 Ti by the slimmest of margins with the compressible black texture.

Fiji has a ton of texture filtering capacity on tap, especially for simpler formats. The Fury X falls behind the GTX 980 Ti when filtering texture formats that are 16 bits per color channel, though. That fact will matter more or less depending on the texture formats used by the game being run.

The Fury X achieves something close to its maximum theoretical rate in our polygon throughput test, at least when the polygons are presented in a list format. However, it still trails even the Kepler-based GeForce GTX 780 Ti, let alone the newer GeForces. Adding tessellation to the mix doesn’t help matters. The Fury X still manages just over half the throughput of the GTX 980 Ti in TessMark.

Fiji’s massive shader array is not to be denied. The Fury X crunches through its full 8.6 teraflops of theoretical peak performance in our ALU throughput test.

At the end of the day, the results from these directed tests largely confirm the major contrasts between the Fury X and the GeForce GTX 980 Ti. These two solutions have sharply divergent mixes of resources on tap, not just on paper but in terms of measurable throughput.

Project Cars

Project Cars is beautiful. I could race around Road America in a Formula C car for hours and be thoroughly entertained, too. In fact, that’s pretty much what I did in order to test these graphics cards.


Click the buttons above to cycle through the plots. What you'll see are frame times from one of the three test runs we conducted for each card. You'll notice that PC graphics cards don't always produce smoothly flowing progressions of succeeding frames of animation, as the term "frames per second" would seem to suggest. Instead, the frame time distribution is a hairy, fickle beast that may vary widely. That's why we capture rendering times for every frame of animation—so we can better understand the experience offered by each solution.

The Fury X’s bright red plot indicates consistently lower frame times than the R9 290X’s purple plot. The dual-GPU R9 295 X2 often produces even lower frame times than the Fury X, but there’s a nasty spike near the middle of the test. That’s a slowdown that you can feel while gaming in the form of a stutter. The Fury X avoids that fate, and playing Project Cars on it generally feels smooth as a result.

Unfortunately for the red team, the Fury X doesn’t crank out frames as quickly as the GeForce GTX 980 Ti. The 980 Ti produces more frames over the course of the test run, so naturally, its FPS average is higher.

Frame time (ms) | FPS rate
8.3  | 120
16.7 | 60
20   | 50
25   | 40
33.3 | 30
50   | 20

Higher averages aren't always an indicator of smoother overall animation, though. Remember, we saw a big spike in the 295 X2's plot. Even though its FPS average is higher than the Fury X's, gameplay on the 295 X2 isn't as consistently smooth. That's why we prefer to supplement average FPS with another metric: 99th percentile frame time. This metric simply says "99% of all frames in this test run were produced in X milliseconds or less." The lower that threshold, the better the general performance. In this frame-time-focused metric, the Fury X just matches the 295 X2, despite a lower FPS average.
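Computing the metric itself is straightforward. A simplified sketch of both numbers (our production analysis pipeline is more involved):

```python
def fps_average(frame_times_ms):
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def percentile_99(frame_times_ms):
    """99% of frames completed in this many milliseconds or less."""
    ordered = sorted(frame_times_ms)
    return ordered[int(0.99 * len(ordered)) - 1]  # simple nearest-rank method

# A run with three nasty spikes: the FPS average still looks healthy,
# but the 99th-percentile frame time gives the stutter away.
run = [16.7] * 97 + [120.0] * 3
print(f"{fps_average(run):.0f} FPS average")        # ~51 FPS
print(f"{percentile_99(run)} ms 99th percentile")   # 120.0 ms
```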

Almost all of the cards handle this challenge pretty well, considering that we’re asking them to render in 4K at fairly high image quality settings. All but one of them manage a 99th percentile frame time of below 33 milliseconds. That means, on a per-frame basis, they perform at or above 30 FPS 99% of the time.

We can understand in-game animation fluidity even better by looking at the entire “tail” of the frame time distribution for each card, which illustrates what happens with the most difficult frames.


These curves show generally consistent performance from nearly all of the cards, with the lone exception of the Radeon R9 295 X2. That card struggles with the toughest three percent of frames, and as a result, the line for this dual-Hawaii card curves up to meet the one for the single-Hawaii 290X. These are the dangers of multi-GPU solutions.


These “time spent beyond X” graphs are meant to show “badness,” those instances where animation may be less than fluid—or at least less than perfect. The 50-ms threshold is the most notable one, since it corresponds to a 20-FPS average. We figure if you’re not rendering any faster than 20 FPS, even for a moment, then the user is likely to perceive a slowdown. 33 ms correlates to 30 FPS or a 30Hz refresh rate. Go beyond that with vsync on, and you’re into the bad voodoo of quantization slowdowns. And 16.7 ms correlates to 60 FPS, that golden mark that we’d like to achieve (or surpass) for each and every frame.
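Accumulating the metric is as simple as it sounds; here's a minimal sketch. Note that we count only the time by which each frame overshoots the threshold:

```python
def time_beyond(frame_times_ms, threshold_ms=50):
    """Sum the time spent past a threshold across all frames.

    Only the overage counts: a single 54-ms frame contributes
    4 ms of "badness" against the 50-ms threshold.
    """
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

run = [16.7] * 97 + [54.0, 61.0, 120.0]
print(time_beyond(run, 50))    # 4 + 11 + 70 = 85 ms beyond 50 ms
```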

The frame time plots have few big spikes in them, and the FPS averages here are well above 20. As a result, none of the cards spend any time beyond our 50-ms threshold. Even the 295 X2, which has one spike beyond 50 ms in the frame time plots, doesn’t repeat a slowdown of the same magnitude in the other three test runs. (These results are a median of three runs.)

The Fury X essentially spends no time beyond our 33-ms threshold, either. Like I said, it generally feels pretty good to play this game on it in 4K. Trouble is, the new Radeon falls behind three different GeForces, including the cheaper GTX 980, across a range of metrics. Perhaps the next game will be a different story.

The Witcher 3

Performance in this game has been the subject of some contention, so I tried to be judicious in selecting my test settings. I tested the older Radeons with the Catalyst 15.5 beta drivers here (and 15.15 on the Fury X), and all cards were tested with the latest 1.04 patch for the game. Following AMD’s recommendations for achieving good CrossFire performance, I set “EnableTemporalAA=false” in the game’s config file when testing the Radeon R9 295 X2. As you’ll see below, I also disabled Nvidia’s HairWorks entirely in order to avoid the associated performance pitfalls.


You can tell by the “fuzzy” frame-time plots that the Radeons struggle in this game. That’s particularly an issue when the frame time spikes get to be fairly high—into the 40-60-ms range. The Fury X trails the GTX 980 Ti in the FPS average results, but it falls even further behind in the 99th-percentile frame time metric. This outcome quantifies something you can feel during the test run: the animation hiccups and sputters much more than it should, especially in the early part of the test sequence.

The GeForce GTX 780 Ti struggles here, too. Since we tested, Nvidia has released new drivers that may improve the performance of Kepler-based cards like this one. My limited time with the Radeon Fury X has been very busy, however, so I wasn’t able to re-test the GTX 780 Ti with new drivers for this review.



In our “badness” metric, both the Fury X and the R9 290X spend about the same amount of time beyond the 50-ms threshold—not a ton, but enough that one can feel it. The fact that these two cards perform similarly here suggests the problem may be a software issue gated by CPU execution speed.

Despite those hiccups, the Fury X generally improves on the 290X’s performance, which is a reminder of the Fiji GPU’s tremendous potential.

GTA V

Forgive me for the massive number of screenshots below, but GTA V has a ton of image quality settings. I more or less cranked them all up in order to stress these high-end video cards. Truth be told, most or all of these cards can run GTA V quite fluidly at lower settings in 4K—and it still looks quite nice. You don’t need a $500+ graphics card to get solid performance from this game in 4K, not unless you push all the quality sliders to the right.




No matter how you slice it, the Fury X handles GTA V in 4K quite nicely. The 99th-percentile results track with the FPS results, which is what happens when the frame time plots are generally nice and flat. Again, though, the GTX 980 Ti proves to be measurably faster than the Fury X.

Far Cry 4


At last, we have a game where the Fury X beats the GTX 980 Ti in terms of average FPS. Frustratingly, though, a small collection of high frame times means the Fury X falls behind the GeForce in our 99th-percentile metric.



The similarities between the Fury X and the 290X in our “badness” metric might suggest some common limitation in that handful of most difficult frames.

Whatever the case, playing on the GeForce is smoother, although the Fury X’s higher FPS average suggests it has more potential.

Alien: Isolation




In every metric we have, the Fury X is situated just between the GTX 980 and the 980 Ti in this game. All of these cards are very much competent to play Alien: Isolation fluidly in 4K.

Civilization: Beyond Earth

Since this game’s built-in benchmark simply spits out frame times, we were able to give it a full workup without having to resort to manual testing. That’s nice, since manual benchmarking of an RTS with zoom is kind of a nightmare.

Oh, and the Radeons were tested with the Mantle API instead of Direct3D. Only seemed fair, since the game supports it.




This is a tight one, but the GTX 980 Ti manages to come out on top overall by a whisker. For all intents, though, the Fury X and 980 Ti perform equivalently here.

Battlefield 4

Initially, I tested BF4 on the Radeons using the Mantle API, since it was available. Oddly enough, the Fury X’s performance was kind of lackluster with Mantle, so I tried switching over to Direct3D for that card. Doing so boosted performance from about 32 FPS to 40 FPS. The results below for the Fury X come from D3D.




The Fury X trades blows with the GeForce GTX 980 in BF4. The new Radeon’s performance is fairly solid, but it’s just not as quick as the GTX 980 Ti.

Crysis 3


Ack. The Fury X looks competitive with the GTX 980 Ti in the FPS sweeps, but it drops down the rankings in our 99th-percentile frame time measure. Why?

Take a look at the frame time plot, and that one spot in particular where frame times jump to over 120 milliseconds. This slowdown happens at the point in our test run where there’s an explosion with lots of particles on the screen. There are smaller spikes on the older Radeons, but nothing like we see from the Fury X. This problem is consistent across multiple test runs, and it’s not subtle. Here’s hoping AMD can fix this issue in its drivers.



Our “badness” metric at 50 ms picks up those slowdowns on the Fury X. This problem mars what would otherwise be a very competitive showing versus the 980 Ti.

 

 

Power consumption

Please note that our “under load” tests aren’t conducted in an absolute peak scenario. Instead, we have the cards running a real game, Crysis 3, in order to show us power draw with a more typical workload.

In the Fury X, AMD has managed to deliver a substantial upgrade in GPU performance over the R9 290X with lower power draw while gaming. That’s impressive, especially since the two GPUs are made on the same 28-nm process technology. The Fury X still requires about 50W more power than the GTX 980 Ti, but since its liquid cooler expels heat directly out of the PC case, I’m not especially hung up on that fact. GCN-based GPUs still aren’t as power-efficient as Nvidia’s Maxwell chips, but AMD has just made a big stride in the right direction.

Noise levels and GPU temperatures

These video card coolers are so good, they’re causing us testing problems. You see, the noise floor in Damage Labs is about 35-36 dBA. It varies depending on things I can’t quite pinpoint, but one notable contributor is the noise produced by the lone cooling fan always spinning on our test rig, the 120-mm fan on the CPU cooler. Anyhow, what you need to know is that any of the noise results that range below 36 dBA are running into the limits of what we can test accurately. Don’t make too much of differences below that level.

The Fury X’s liquid cooler lives up to its billing with a performance that’s unquestionably superior to anything else we tested. You will have to find room for the radiator in your case, though. In return, you will get incredibly effective cooling at whisper-quiet noise levels.

 

 

Conclusions and Closing Thoughts

As usual, we’ll sum up our test results with a couple of value scatter plots. The best values tend toward the upper left corner of each plot, where performance is highest and prices are lowest. We’ve converted our 99th-percentile frame time results into FPS, so that higher is better, in order to make this layout work.


If you've been paying attention over the preceding pages, you pretty much know the story told by our FPS value scatter. The Radeon R9 Fury X is a big advance over the last-gen R9 290X, and it's a close match overall for the GeForce GTX 980 Ti. That's massive progress from the red team. However, the GeForce generally outperforms the Fury X across our suite of games, by under 10%, or four FPS, on average, and it's a shame the Fury X's measurably superior shader array and prodigious memory bandwidth don't have a bigger payoff in today's games.

Costs of AMD Radeon R9 Fury X

Speaking of which, if you dig deeper using our frame-time-focused performance metrics—or just flip over to the 99th-percentile scatter plot above—you’ll find that the Fury X struggles to live up to its considerable potential. Unfortunate slowdowns in games like The Witcher 3 and Far Cry 4 drag the Fury X’s overall score below that of the less expensive GeForce GTX 980. What’s important to note in this context is that these scores aren’t just numbers. They mean that you’ll generally experience smoother gameplay in 4K with a $499 GeForce GTX 980 than with a $649 Fury X. Our seat-of-the-pants impressions while play-testing confirm it. The good news is that we’ve seen AMD fix problems like these in the past with driver updates, and I don’t doubt that’s a possibility in this case. There’s much work to be done, though.

Assuming AMD can fix the problems we’ve identified with a driver update, and assuming it really has ironed out the issue with the whiny water pumps, there’s much to like about the Fury X. The GPU has the potential to enable truly smooth gaming in 4K. AMD has managed to keep power consumption in check. The card’s cooling and physical design are excellent; they’ve raised the standard for products of this class. Now that I’ve used the Fury X, I would have a hard time forking over nearly 700 bucks for a card with a standard air cooler. At this price, decent liquid cooling at least ought to be an option.

Also, although we have yet to perform tests intended to tease out any snags, we’ve seen no clear evidence that the Fury X’s 4GB memory capacity creates problems in typical use. We will have to push a little and see what happens, but our experience so far suggests this worry may be a non-issue.

The question now is whether AMD has done enough to win back the sort of customers who are willing to pony up $650 for a graphics card. My sense is that a lot of folks will find the Fury X’s basic proposition and design attractive, but right now, this product probably needs some TLC in the driver department before it becomes truly compelling. AMD doesn’t have to beat Nvidia outright in order to recover some market share, but it does need to make sure its customers can enjoy their games without unnecessary hiccups.

Logitech G935 Wireless Gaming Headset Reviewed
https://techreport.com/review/logitech-g935-7-1-wireless-gaming-headset-reviewed/ (Tue, 10 May 2022)


Introduction

This article is a review of the Logitech G935 wireless gaming headset. It includes figures, pictures, cost information, comparisons, and more. Check Logitech for more information.

The Review of the Logitech G935 Wireless Gaming Headset

Even as recently as a few years ago, the gaming headset market was still figuring out what people wanted in a headset. These days, though, the feature set is pretty well solidified, which can make it hard for each headset to stand out. The differences often come in the particular company’s build quality or look of the hardware, the software they provide, and of course the basic sound quality the headset can muster. Logitech’s G935 wireless gaming headset is, in many ways, exactly what you’d expect from Logitech. It offers a clean look clad in lots of plastic, plenty of customization through Logitech’s G HUB software, and a lot of ways to use the gear.

If you’ve worn Logitech’s previous high-end gaming headset, the G933, you have a fairly good idea of what you’re getting into here. At a glance, the two headsets look all but identical. That’s okay, because there are a ton of great features that make the G935 worth a look on its own at the $179 price Logitech is asking.

Build, Style & Fit

Like the G933, the G935 is a mostly plastic affair; the only visible metal is the flexible stuff in the headband. The styling is a little different, but I literally had to compare the two side by side to notice.


Left: Logitech G933; Right: Logitech G935.

But that’s okay, because they look fine. The G935’s aesthetics are about what I’d expect in a gaming headset, if a bit subdued. The RGB LED lighting is basically identical to the G933, as is the shape of the boom mic.

Even the design inside the ear cups is the same. Inside the left earcup is a nice storage spot for the USB dongle, while the right earcup houses a rechargeable, replaceable Lithium-Ion battery.

The left ear also houses all the inputs and buttons. There are jacks for a 3.5mm audio cable (included), a micro USB cable for charging, four buttons, a power switch, and a volume knob. Three of the four buttons are programmable, while one is a dedicated mute button. That last one seems a little superfluous given that the mic features the “lift to mute” functionality that I’ve come to expect from powered headsets. Why build it in twice?

Instead of the sport mesh that we’ve seen on lots of previous Logitech headset earcups, the G935 has a faux leatherette type of material. The earcup material is replaceable, though the set doesn’t come with any extra cups. The headband cushion is made from the same material.

I found that wearing the G935 for extended periods is mostly very comfortable. Even as I’m writing this, they’re sitting on my head pouring Metallica into my ears, and they have been for hours. But they’re not my favorite, even among Logitech headsets. Logitech’s G533 and G Pro headsets both have more even head pressure and stand up better to head movement. The G935 will handle regular gaming movement just fine without budging, but if you do the old wet-dog headshake, the G935 starts moving sooner than the G533. The head pressure thing is going to be very subjective. If you have a slightly smaller head, the 533 might end up feeling loose.

 

Features, sound, and conclusion

One of the most important features in a high-end headset, for me at least, is that 3.5mm analog jack. While there are doubtlessly many gamers out there who play exclusively on PC, personally I switch between PC and console on a daily basis. For me, leaving this feature off a headset at this price point is inexcusable, but it happens enough that Logitech’s inclusion here is still laudable.

The biggest upgrade from the G933, though, is the switch from 40mm to 50mm drivers in each ear. Aside from the obvious 25% increase in driver diameter, Logitech says it has redesigned the entire casing around the driver and claims the new driver cuts down on low-frequency distortion. We'll get into how it sounds down below.

Another big feature for the G935 is the addition of DTS Headphone:X 2.0. The first revision of Headphone:X made surround sound possible with two-driver headphones, but the 2.0 version allows game creators to put sounds anywhere, not just where surround sound speakers would generally be placed.

The advertised battery life is eight hours with the RGB LED lighting on and 12 hours with it off. This matches up well with my experience with the headphones. I went through about two full charges while testing these for gaming with the lighting on, and 16 hours total is about right.

How it sounds

When we’re looking at a gaming headset rather than a pair of headphones, there are two distinct sound profiles (gaming and music) to consider, and two modes (wired and wireless).

On the gaming front, the DTS Headphone:X 2.0 really does make a difference. I played about eight hours of EA’s Anthem demo and a few hours of my one true gaming love, Doom. Because Anthem was an all-new experience, it’s a little harder to gauge the difference that the G935 made. In Doom, though, I was impressed with how well the surround sound works. I noticed noises I hadn’t heard before, and it was helping me catch enemies at my back before they got me. The audio feels clean and clear, and the ability to adjust EQ through the G HUB software means that just about anyone should be able to find a sound profile that works for them.

Talking over Discord using the wireless connection, I was pleased with how clear my chat partners’ speech came through, and they were especially impressed with how I sounded.

I mention the wireless connection specifically because the results were a little less impressive when I plugged the headset in. In addition to my time gaming on PC, I spent a good chunk of time playing Sea of Thieves on Xbox One X, in multiplayer the whole time and with the headset plugged into the controller.

Game audio was acceptable. Here, the headphones simply act as 2.0 stereo cans, and they do a fine job of that. Where they fell short was in delivering my voice to my players. I have a couple friends I regularly play this game with. Normally when I play with them, I use Logitech’s G Pro headset—a personal favorite because of its light feel, solid mic, and the excellent nylon-wrapped 3.5mm audio cable it comes with. The G935 has a shorter, rubberized cable that’s acceptable but has all the problems that come with rubberized cables. It catches on things like zippers and has some (though not much) shape memory. It’s not the worst, but I’m a little disappointed that the nylon cable didn’t make the jump.

But back to the sound: My shipmates noticed right away that I was on a different headset than usual. “You sound weird, did you change headsets?” they asked. To troubleshoot a little bit, I switched to Logitech’s also-new G432 and then to the G Pro. Both of those received high marks from my crew, but switching back to the G935 got the same lackluster result as before. I spent my time in Anthem with one of the same players. Thus, we had the same set of ears, using the same headphones, listening to me on three different headsets across two different gaming devices and networks, and the G935’s Xbox performance stood out in a negative way. Given that the other two headsets sounded fine on the Xbox One X, it doesn’t seem like Xbox Live is the problem. The same headset sounded better on the PC than on the Xbox, so it’s not the headset mic itself. It seems like something about plugging in with the wired connection with this headset was the cause of the drop in audio quality. This is total conjecture, but my guess is that something bandwidth-related is getting in the way to make the audio quality drop.


Logitech G935 microphone

As a set of music headphones, the G935 performs pretty well for a gaming headset, but personally, I still prefer my ATH-M50x headphones. It's worth keeping in mind, though, that they go for just $25 less and don't feature wireless technology, a microphone, a battery, LED lighting, or 7.1 surround sound. I'll admit some personal bias toward dedicated music headphones there. My M50x headphones get louder and sound clearer than the G935. By comparison, the G935 feels like it offers less instrument separation, and in the default configuration it leans heavier on bass. Metallica's "Of Wolf and Man" didn't have the punch I expect from the drums, for example. Miles Davis' "Blue in Green" felt subdued.

I still found the sound acceptable, though, and I don’t think the differences will stand out unless you’re the kind of person who owns music headphones and has strong opinions about them.

In conclusion: Logitech G935 Wireless Gaming Headset

The only big knocks I can offer against the G935 are the head pressure and the console/wired sound quality. In every other department, it's a rockstar. It offers solid battery life and checks all those must-have boxes for a pricey gaming headset: 3.5mm connectivity (even if it's subpar), a lift-to-mute mic, surround sound, and EQ functionality. DTS Headphone:X 2.0 feels like a substantial upgrade over Headphone:X, too.

If your primary gaming platform is a console or mobile phone, the G935 might not be the best option, but if you plug into your PC to get your game on, the G935 earns every cent of the $179 price tag.

Crypto.com: Four Pros and Cons
https://techreport.com/software-news/crypto-com-pros-and-cons/ (Tue, 10 May 2022)


As the cryptocurrency world expands, people are turning to various websites for all of their crypto needs. Based in Singapore and founded in 2016, Crypto.com has more than 10 million users globally and continues to grow as cryptocurrency becomes the norm. The platform lets users buy, sell, trade, and store 250+ kinds of coins on the platform. Here are four pros and cons of using Crypto.com to consider before deciding.

1. Pro: Lower fees when using cash.

If a user purchases crypto on Crypto.com with funds transferred from a bank account via an Automated Clearing House (ACH) transfer, there aren't any transaction fees. On top of the ease of buying from a bank account, users can deposit money into their crypto accounts using a bank or wire transfer with no fee. This also applies to buying and selling cryptocurrencies at value.

  • Con: Higher fees when using debit and credit.

When making debit and credit card purchases, the fees are much higher — as high as 2.99%. Avoiding fees is possible, but not without some additional steps. Many users are opting to accept these fees for the convenience of funding their Crypto.com account using cash. Credit card payments may also incur cash advance fees if used for crypto purchases.
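To put that 2.99% in perspective: on a $1,000 purchase, a card payment could cost up to $29.90 in fees, while the same purchase funded by an ACH bank transfer would incur no transaction fee at all.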

2. Pro: Easy to convert fiat currency.

Not only does Crypto.com make converting fiat currencies into crypto much more straightforward, but the platform also accepts multiple fiat currencies, including AUD, CAD, EUR, GBP, USD, USDC, and BRL. The fiat wallet allows users to withdraw or deposit money to and from their bank accounts. Fiat wallets within the platform aid in making crypto purchases, withdrawing fiat from your bank account, and more.

  • Con: Some services are unavailable in the United States.

While most services are available for customers in the United States using the mobile app, there are several services not yet available to people in America. Margin trading and other exchange options are not available for U.S. residents at this time.

3. Pro: Many cryptocurrency assets to choose from.

Crypto.com has a massive selection of cryptocurrencies available. Currently, the platform has 250+ cryptocurrencies available. Having so many cryptocurrencies available on one platform is excellent for people who buy, sell, and trade multiple kinds of coins.

  • Con: Crypto-to-crypto trading is unavailable.

Crypto-to-crypto trading is not available on Crypto.com, which means cryptocurrencies cannot be exchanged directly on the platform. If a user has one coin and wants a different kind of coin, they have to sell the first cryptocurrency and then make a second transaction to purchase the coin they want.

4. Pro: Ability to earn interest.

As an account holder on Crypto.com, users can earn up to 14% interest. Interest rates depend on several factors, including the amount of CRO tokens staked and which currency you're saving. For example, interest rates increase with longer deposit terms (three-month terms pay better rates than one-month terms), and investing in stablecoins can also help users earn more interest.

  • Con: Card cashback and reimbursements not paid in cash.

Reimbursements and card cashbacks are not paid in cash — they’re paid in CRO (Cronos) Tokens, the native token of the platform. By converting these benefits into crypto, you take on the fees accrued during trading and the ups and downs of the CRO token.

All in all, Crypto.com is becoming more and more mainstream as the realm of cryptocurrency expands. It's essential to make sure you're using the software that best fits your needs, and there's a lot to learn about the new crypto world. Before investing, be sure to spend some time learning more about cryptocurrency platforms and software.

Review of Nvidia GeForce GTX 690 Graphics Card
https://techreport.com/review/nvidias-geforce-gtx-690-graphics-card/ (Mon, 09 May 2022)

Introduction

This article is a review of the Nvidia GeForce GTX 690 graphics card. It includes photos, tables, graphs, information, costs, game tests, and more.

The Review of Nvidia GeForce GTX 690

Man, I have to say, thank goodness for tablet computers, smart phones, Xboxes, and whatever else is distracting the masses from PC hardware. The rise of those other devices is supposed to be siphoning off interest from the personal computer, and on some levels, that must be true. There was a time when, as a guy who wrote for TR, I could make conversation with friends and relations about the latest Pentium or Radeon or whatever. These days, those conversations are all about iPads and such instead.

At the same time, it appears those of us still paying attention to, you know, the most powerful consumer computing platform are living in some sort of a magical future-land where our most persistent gripes have been replaced by difficult choices between multiple amazing options. Want a quiet case? Easily done. Want a case capable of housing powerful hardware? Easily had, as well. Want a case that’s both at once? Also readily available. Need a power supply? This one’s modular and makes zero noise at idle. Lousy keyboard got you down? Here, take your pick from ten different mechanical offerings with four different switch types. This one will massage your fingertips as you type.

Decent Computer Parts

Decent computer parts can still be had for cheap, but if you want to pay more in order to get something that’s higher quality, the choices now are better than ever. Component makers are increasingly catering to the desires of PC hobbyists, and frankly, we could get used to it. Already are used to it, really, except when we’re occasionally surprised by another nifty example of the trend.

We were a little startled recently when we received a package containing nothing but the prybar you see above. No, as far as we know, this isn’t a new motion controller for Left 4 Dead, although that would be awesome. Instead, it’s just a big, metal implement. We weren’t sure what to make of it until several days later, when the following arrived:

Without the prybar, I would have surely chipped a tooth trying to get that crate open, so thank goodness.

Inside was the subject of our attention today, the GeForce GTX 690. Yes, this is the new uber-high-end graphics card from Nvidia that packs two of the GK104 graphics processors found in the GeForce GTX 680. You can probably tell from the picture above that the GTX 690 looks a little different from your typical graphics card. Here are a couple more shots, closer up, to give you a better sense of things.

Yes, the GTX 690 looks distinctive. What may not be obvious from the pictures is that the card’s sleek lines and metallic color palette are not a plasticky imitation of something better, as one might expect given the history here. Instead, this premium graphics card is built with premium materials. The silver-colored portions of the cooling shroud are chromium-plated aluminum, and the black piece between them is made of magnesium. Beneath the (still plastic) windows on either side of the cooling fan, you can see the nickel-plated fins of the card’s dual heatsinks. Oh, and there’s a bit of a light show, too, since the green “GeForce GTX” lettering is LED-illuminated.

Touch a fingertip to the cool, solid surfaces of the GTX 690, and you can feel the extra expense involved. Use that fingertip to give the cooling fan a spin, and you’ll feel it there, too, in a motion that’s perceptibly smoother than most. No expense was spared on the materials for this card, and it shows in little ways that, we’ll admit, not everyone will appreciate. We can’t help but like it, though. In terms of look and feel, if the GTX 690 has a rival among current video cards, it may be XFX’s aluminum-shrouded Radeons. But you’ll need two of those in order to approach the GTX 690’s performance.

Nvidia tells us it has invested heavily in tuning the acoustics of the GTX 690’s cooler, as well. Beyond the fancy fan mechanism, the base plate beneath the fan features ducted air channels. Mounted on the board are very low profile capacitors, intended to reduce turbulence in the air flowing across the heatsinks. Time constraints have kept us from disassembling our GTX 690 yet, but below are a couple of stock pictures of the areas in question. As you can see, the GTX 690’s cooler is designed to send air flowing in two directions: half toward the back of the card and outside of the case, and half toward the front of the card, into the PC enclosure.


Source: Nvidia.


Source: Nvidia.

All told, Nvidia expects the GTX 690’s cooler to be very quiet for what it is—quieter even than some of the firm’s single-GPU cards. We’ll test that claim shortly, of course.

Specs like EyeMasters

Card | GPU base clock (MHz) | GPU boost clock (MHz) | Shader ALUs | Textures filtered/clock | ROP pixels/clock | Memory transfer rate | Memory interface width (bits) | Peak power draw
GeForce GTX 680 | 1006 | 1058 | 1536 | 128 | 32 | 6 GT/s | 256 | 195W
GeForce GTX 690 | 915 | 1019 | 3072 | 256 | 64 | 6 GT/s | 2 x 256 | 300W

The GeForce GTX 690’s specifications are eye-popping, which is mostly what you’d expect from an SLI-on-a-stick graphics card. All of the GK104’s units are enabled, so many of the key rates are twice the GTX 680’s. Since the GTX 690 is a dual-GPU pairing, of course, the peak graphics rates shown in the table above are somewhat less connected to reality than usual. Applications may or may not take advantage of all of that power depending on many things, some of which we’ll discuss shortly.

The GTX 690 does have loads of bandwidth on tap, though. Between the two GPUs is a PCI Express bridge chip supplied by PLX; it has 48 lanes of PCIe Gen3 connectivity, 16 lanes routed to each GPU and 16 lanes connected to the host system.

Although the prior-generation GeForce GTX 590 performed more like a couple of down-spec GTX 570s, Nvidia has been able to reach relatively higher with this new card. Some of the credit goes to the Kepler generation’s new GPU Boost dynamic voltage and frequency scaling feature, which raises clock speeds to take advantage of any available thermal headroom. The GTX 690’s “base” clock is lower than the GTX 680’s by quite a bit, but the 690 has more range built into it. The 690’s “boost” clock of 1019MHz isn’t far from the GTX 680’s boost clock of 1058MHz. If the workload and the ambient conditions allow enough headroom, the GTX 690 should operate at something close to its boost clock rate—sometimes at even higher frequencies than that. As a result, Nvidia expects the GTX 690 to perform very similarly to a pair of GTX 680s in SLI.
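Nvidia doesn't publish the Boost algorithm, but the broad idea is a simple control loop. A purely hypothetical sketch (the clock step and power-per-step numbers here are invented for illustration, not Nvidia's):

```python
def pick_boost_clock(headroom_watts, base_mhz=915, step_mhz=13, watts_per_step=3.0):
    """Toy model of a GPU Boost-style governor.

    Starting at the base clock, keep stepping up as long as the
    estimated power cost of the next step fits within the remaining
    thermal/power headroom. More headroom means higher clocks.
    """
    clock = base_mhz
    while headroom_watts >= watts_per_step:
        clock += step_mhz
        headroom_watts -= watts_per_step
    return clock

print(pick_boost_clock(headroom_watts=24))  # 915 + 8*13 = 1019 MHz
```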

Performing like a pair of GTX 680s is pretty impressive in the grand scheme, since the GTX 680 can claim to be the world's fastest GPU. However, the GK104 graphics processor isn't exactly a heavyweight, by most other standards; it's a mid-sized chip with a modest power envelope and a 256-bit memory interface. In many ways, the GK104 is more like the GF104 chip that powers the GeForce GTX 560 Ti than it is like the GF110 chip that powers the GTX 580. That lineage is probably why the GTX 690 has "only" 2GB of memory per GPU—4GB in total, but effectively 2GB for all intents and purposes. (The Radeon HD 7970, by contrast, has 3GB for a single GPU.) The GK104's middleweight status was no doubt helpful when Nvidia was attempting to cram two GPUs onto a single card in a reasonable power envelope. In fact, the GTX 690's max power rating of 300W is 65W lower than the GTX 590's.


The GeForce GTX 590 (left) versus the GTX 690 (right)

In many other respects, the GTX 690 mirrors its predecessor. The board is 11″ long, occupies two expansion slots, and requires a pair of 8-pin aux power inputs. The display outputs include a trio of dual-link DVI ports and a single mini-DisplayPort connector. The GTX 590 is just very, uh, plasticky by comparison.

Card | Peak pixel fill rate (Gpixels/s) | Peak bilinear texel filtering int/FP16 (Gtexels/s) | Peak shader arithmetic (TFLOPS) | Peak rasterization rate (Gtris/s) | Peak memory bandwidth (GB/s)
GeForce GTX 580 | 37 | 49/49 | 1.6 | 3.1 | 192
GeForce GTX 680 | 34 | 135/135 | 3.3 | 4.2 | 192
GeForce GTX 590 | 58 | 78/78 | 2.5 | 4.9 | 328
GeForce GTX 690 | 65 | 261/261 | 6.5 | 8.2 | 384
Radeon HD 7970 | 30 | 118/59 | 3.8 | 1.9 | 264
Radeon HD 6990 | 53 | 159/80 | 5.1 | 3.3 | 320
Radeon HD 6990 AUSUM | 56 | 169/85 | 5.4 | 3.5 | 320

The GTX 590 is also quite a bit slower than the 690, in theory, as is nearly every other video card out there. Only the Radeon HD 6990 with its special “AUSUM” overclocking switch thrown really competes on any of the key rates, like memory bandwidth and texture filtering—and in other important respects, it’s not even close. We’re not likely to see a true competitor for the GTX 690 until AMD takes the wraps off of its dual-Tahiti card. Frankly, we’re kind of surprised to have made it this far into 2012 without seeing AMD’s dual-GPU entry for this generation, since they’ve been talking about it for some time. Until that product arrives, the GTX 690 is pretty much in a class by itself.

That brings us, inevitably, to the question of price. Given the GTX 690’s premium materials and performance, Nvidia has decided to slap a price tag on this puppy that reads: $999.99, one penny short of a grand. I believe that makes the GTX 690 the most expensive consumer graphics card ever. The one-grand sticker essentially doubles the GTX 680’s list price, so it makes a sort of sense. Still, you’d kind of hope for some sort of volume discount when buying two GPUs together, wouldn’t you?

I dunno. I’m not sure the folks who would pony up for this sort of card will care that much.

One thing that this, er, formidable price tag could do is keep demand in line with the limited supply of these cards. Most folks are keenly aware that the supply of GK104 chips is rather tight right now, since the GTX 680 is tough to find in stock anywhere. Furthermore, the dual-GPU cards of the last generation, the Radeon HD 6990 and the GeForce GTX 590, seem to have been in short supply throughout their model runs. We expect the GTX 690 to reach online store shelves this week, but we have few illusions about them being plentiful, at least initially.

A Testing Conundrum

As you might recall, we’ve been skeptical about the merits of multi-GPU solutions like the GeForce GTX 690 since we published this article last fall. That piece introduced some new ways to think about gaming performance, and the methods we proposed immediately highlighted some problems with SLI and CrossFire.

Multi-GPU schemes generally divide the work by asking a pair of GPUs to render frames in alternating fashion—frame 1 to GPU 0, frame 2 to GPU 1, frame 3 to GPU 0, and so on. The trouble is, the two GPUs aren’t always in sync with one another. Instead of producing a series of relatively consistent frame delivery times, a pair of GPUs using alternate frame rendering will sometimes oscillate between low-latency frames and high-latency frames.

To illustrate, we can zoom in on a very small chunk of one of our test runs for this review. First, here’s how the frame times look on a single-GPU solution:

Although frame times vary slightly on the single-GPU setup, the differences are pretty small during this short window of time. Meanwhile, look what happens on a CrossFire setup using two of the same GPU:

You can see that alternating pattern, with a short frame time followed by a long one. That’s micro-stuttering, and it’s a potentially serious performance issue. If you were simply to measure this solution’s performance in average frames per second, of course, it would look pretty good. Lots of frames are being produced. However, our sense is that the smoothness of the game’s animation will be limited by those longer frame times. In this short window, adding a second GPU appears to reduce long-latency frames from about 29 ms to about 23 ms. Although the FPS average might be nearly doubled by the presence of all of those low-latency frames, the real, perceived impact of adding a second card would be much less than a doubling of performance.
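One crude way to put a number on that oscillation is to look at the swing between consecutive frames rather than the frames themselves. A sketch (not our production tooling):

```python
def microstutter_index(frame_times_ms):
    """Mean absolute frame-to-frame swing, in milliseconds.

    A single GPU producing steady ~29-ms frames scores near zero.
    An AFR pair oscillating between ~10-ms and ~23-ms frames scores
    around 13, even though its FPS average looks far better.
    """
    swings = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(swings) / len(swings)

print(microstutter_index([29, 30, 29, 28, 29, 30]))  # 1.0
print(microstutter_index([10, 23, 11, 24, 10, 23]))  # 13.0
```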

This problem affects both SLI and CrossFire, including multi-GPU graphics cards like the GTX 690. How much micro-stuttering you find can vary from one moment to the next. In this example, we can see a little bit of jitter from the GTX 690, but it’s fairly minimal.

However, it appears that the degree of jitter tends to grow as multi-GPU solutions become more performance-constrained. That’s bad news in our example for the older dual-GPU graphics cards:

Ouch. If this trend holds up, the more you need higher performance from a multi-GPU solution, the less likely it is to deliver. Kind of calls the value proposition into question, eh?

Things get even trickier from here, for several reasons. Both AMD and Nvidia acknowledged the multi-GPU micro-stuttering problem when we asked them about it, but Nvidia’s Tom Petersen threw us for a loop by asserting that Nvidia’s GPUs have had, since “at least” the G80, a built-in provision called frame metering that attempts to counteract the problem.


Source: Nvidia

The diagram above shows the frame rendering pipeline, from the game engine through to the display. Frame metering attempts to smooth out the delivery of frames by monitoring frame times and, as necessary, adding a slight delay between a couple of points on the timeline above, T_render and T_display. In other words, the GPU may try to dampen the oscillating pattern characteristic of micro-stuttering by delaying the display of completed frames that come “early” in the sequence.

We think frame metering could work, in theory, with a couple of caveats. One obvious trade-off is the slight increase in input lag caused by delaying roughly half of the frames being rendered, although the impact of that should be relatively tiny. The other problem is the actual content of the delayed frames, which is timing-dependent. The question here is how a game engine decides what time is “now.” When it dispatches a frame, the game engine will create the content of that image—the underlying geometry and such—based on its sense of time in the game world. If the game engine simply uses the present time, then delaying every other frame via metering will cause visual discontinuities, resulting in animation that is less smooth than it should be. However, Petersen tells us some game engines use a moving average of the last several frame times in order to determine the “current” time for each frame. If so, then it’s possible frame metering at the other end of the graphics pipeline could work well.
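
To illustrate the concept, and only the concept, here is a toy sketch of frame metering in Python. Nvidia hasn’t disclosed its actual algorithm, so the moving-average pacing below is merely our guess at the flavor of the thing:

```python
def meter_frames(t_render, window=4):
    """t_render: timestamps (ms) at which frames finish rendering.
    Returns display (buffer flip) times paced to a smoothed cadence."""
    t_display = [t_render[0]]
    for i in range(1, len(t_render)):
        # Moving average of the last few render-to-render intervals
        lo = max(1, i - window + 1)
        intervals = [t_render[j] - t_render[j - 1] for j in range(lo, i + 1)]
        pace = sum(intervals) / len(intervals)
        # Flip no sooner than one smoothed interval after the previous flip,
        # and never before the frame has actually finished rendering
        t_display.append(max(t_render[i], t_display[-1] + pace))
    return t_display

# Completion times with the telltale short/long oscillation (6 ms, 23 ms gaps)
print(meter_frames([0, 6, 29, 35, 58, 64, 87]))
```

Notice how the “early” frame in each short/long pair gets held back by a few milliseconds, which is precisely the small input-lag trade-off described above.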

A further complication: we can’t yet measure the impact of frame metering—or, really of any multi-GPU solution—with any precision. The tool we use to capture our performance data, Fraps, writes a timestamp for each frame at a relatively early point in the pipeline, when the game hands off a frame to the Direct3D software layer (T_ready in the diagram above). A huge portion of the work, both in software and on the GPU, happens after that point.

We’re comfortable with using Fraps for single-GPU solutions because it captures frame times at a fixed point in what is essentially a feedback loop. When one frame is dispatched, the system continues through the process and moves on to the next, stopping at the same point in the loop each time to record a timestamp.

That feedback loop loses its integrity when two GPUs handle the work in alternating fashion, and things become particularly tricky with other potential delays in play. Fraps has no way of knowing when a buffer flip has happened at the other end of the pipeline, especially if there’s a variable metering wait involved—so frame delivery could be much smoother in reality than it looks in our Fraps data. By the same token, multi-GPU schemes tend to have some additional latency built into them. With alternate frame rendering, for instance, a frame completed on the secondary GPU must be transferred to the primary GPU before it can be displayed. As a result, it’s possible that the disparity between frame display times could be much worse than our Fraps data show, as well.

So, what to do if you’re us, and you have a multi-GPU video card to review? The best we can say for our Fraps data is that we believe it’s accurate for what it measures, the point when the game engine presents a frame to Direct3D, and that we believe the frame times it captures are at least loosely correlated to the actual display times at the other end of the pipeline. We can also say with confidence that any analysis of multi-GPU performance based solely on FPS averages is at least as wrong as what we’re about to show you. We had hoped to have some new tools at our disposal for this article, including a high-speed camera we ordered, but the camera didn’t arrive in time for this review, unfortunately. We will have to follow up with it at a later date. For now, we’ll have to march ahead with some big, hairy caveats attached to all of our performance results. Please keep those caveats in mind as you read the following pages.

Testing the Nvidia GeForce GTX 690

Test notes

In order to take full advantage of high-end graphics cards these days, you’ve got to ask a lot of ’em. That’s why we decided to conduct our testing for this review with a trio of monitors, all Dell U2410s, each with a display resolution of 1920×1200.

Together, they have a collective resolution of about six megapixels, roughly 50% more pixels than the 30″ monitor we usually use for GPU testing. The increased resolution and complexity made it fairly easy to push the limits of these multi-GPU setups. We even had to go easy on the image quality settings in some cases to maintain playable frame rates.

Most of our multi-GPU pairings were built from cards we’ve tested before, but our GTX 680 team had one brand-new member: Zotac’s GeForce GTX 680 AMP!, a product just announced today. Obviously, that’s not a stock cooler, but it is very swanky. This is an AMP! edition, so its default clock speeds are quite a bit higher than a stock GTX 680’s. The base and boost frequencies are 1111MHz and 1176MHz, well above the stock 1006/1071MHz speeds. Even more impressively, perhaps, the Zotac card’s memory speed is 1652MHz, up from 1502MHz stock. We suspect memory bandwidth may be an important performance limiter on the GTX 680, so the higher RAM speeds are noteworthy. Zotac is asking $549 for this card, 50 bucks above the stock GTX 680’s list price.

For the purposes of this review, we committed the heinous crime of dialing back the Zotac GTX 680 card’s base and memory clock speeds to match the other card in the SLI pairing, which was a standard-issue GTX 680. We’re worried about GPUs being out of sync, after all, and we didn’t want to make matters worse with a mismatch. (We did the same with the XFX Radeon HD 7970, bringing it back to stock clocks to match the other card.) The thing is, the utilities we had on hand wouldn’t let us straightforwardly control the Zotac card’s boost clock, so perfect symmetry eluded us.

With the GTX 680, that is kind of the way of things, though. Nvidia expects slightly variant performance from every GTX 680 card thanks to GPU Boost, which will adjust to the particulars of a card’s thermals, the individual chip’s properties, and such. Two GTX 680s in SLI aren’t likely to run at exactly the same speed, since the thermal conditions at one spot in a system will vary from those at another. Nvidia anticipates that the frame metering capabilities in the GK104 will keep frame delivery consistent, regardless.

Oh, and please note that we tested the Radeon HD 6990 with its “AUSUM” switch enabled, raising its clock speed and PowerTune limits. We saw no reason not to test it in that configuration, given what it is.

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and we’ve reported the median result.

Our test systems were configured like so:

Processor: Core i7-3820
Motherboard: Gigabyte X79-UD3
Chipset: Intel X79 Express
Memory size: 16GB (4 DIMMs)
Memory type: Corsair Vengeance CMZ16GX3M4X1600C9 DDR3 SDRAM at 1600MHz
Memory timings: 9-9-11-24 1T
Chipset drivers: INF update 9.3.0.1019, Rapid Storage Technology Enterprise 3.0.0.3020
Audio: Integrated X79/ALC898 with Realtek 6.0.1.6526 drivers
Hard drive: Corsair F240 240GB SATA
Power supply: Corsair AX850
OS: Windows 7 Ultimate x64 Edition, Service Pack 1, DirectX 11 June 2010 Update
Card                             Driver revision              Base core clock (MHz)   Memory clock (MHz)   Memory size (MB)
GeForce GTX 590                  ForceWare 301.24             608                     854                  3072
GeForce GTX 680                  ForceWare 301.33             1006                    1502                 2048
GeForce GTX 680 + Zotac GTX 680  ForceWare 301.33             1006                    1502                 2048
GeForce GTX 690                  ForceWare 301.33             915                     1502                 4096
Radeon HD 6990 AUSUM             Catalyst 12.4 + 12.3 CAP 1   880                     1250                 4096
Radeon HD 7970                   Catalyst 12.4 + 12.3 CAP 1   925                     1375                 3072
Radeon HD 7970 + XFX HD 7970     Catalyst 12.4 + 12.3 CAP 1   925                     1375                 3072

Thanks to Intel, Corsair, and Gigabyte for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following test applications:

Some further notes on our methods:

  • We used the Fraps utility to record frame rates while playing a 90-second sequence from the game. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We tested each Fraps sequence five times per video card in order to counteract any variability. We’ve included frame-by-frame results from Fraps for each game, and in those plots, you’re seeing the results from a single, representative pass through the test sequence. (For a sketch of how we boil these logs down into per-frame times, see the code just after this list.)
  • We measured total system power consumption at the wall socket using a Yokogawa WT210 digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

    The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Skyrim at its Ultra quality settings with FXAA enabled.

  • We measured noise levels on our test system, sitting on an open test bench, using an Extech 407738 digital sound level meter. The meter was mounted on a tripod approximately 10″ from the test system at a height even with the top of the video card.

    You can think of these noise level measurements much like our system power consumption tests, because the entire systems’ noise levels were measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

  • We used GPU-Z to log GPU temperatures during our load testing.
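
For the curious, here is roughly how we reduce a captured log into per-frame render times. This sketch assumes Fraps’ frame-time CSV layout of one cumulative timestamp per frame; the file name is hypothetical:

```python
import csv

def load_frame_times(path):
    """Read a Fraps-style frametimes CSV (frame number, cumulative ms)
    and return per-frame render times in milliseconds."""
    stamps = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            stamps.append(float(row[1]))
    # Differences between consecutive timestamps are the frame times
    return [b - a for a, b in zip(stamps, stamps[1:])]

frame_times = load_frame_times("gtx690_skyrim_run1.csv")
print(sum(frame_times) / len(frame_times))  # mean frame time in ms
```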

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

The Elder Scrolls V: Skyrim

Our test run for Skyrim was a lap around the town of Whiterun, starting up high at the castle entrance, descending the stairs into the main part of town, and then doing a figure-eight around the main drag.

Since these are pretty capable graphics cards, we set the game to its “Ultra” presets, which turns on 4X multisampled antialiasing. We then layered on FXAA post-process anti-aliasing, as well, for the best possible image quality without editing an .ini file. We also had the high-res texture pack installed, of course. Although it’s not pictured above, the total display resolution was 5760×1200.

Frame time (ms)   FPS rate
8.3               120
16.7              60
20                50
25                40
33.3              30
50                20

These first three plots show the raw data from a single test run, the rendering times for each individual frame, shown in milliseconds. Notice that because we’re thinking in terms of frame latency, lower numbers are better. For reference, we’ve included a table above showing the conversions from frame times to FPS.
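
The conversion itself is just a reciprocal, and it is only exact if every frame arrives at a perfectly steady cadence:

```python
def frame_time_to_fps(ms):
    # A steady stream of identical frame times is the only case where
    # this mapping holds exactly
    return 1000.0 / ms

print(frame_time_to_fps(16.7))  # ~60 FPS
print(frame_time_to_fps(33.3))  # ~30 FPS
```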

As you can see, the GTX 690 performs essentially identically to two GTX 680s in SLI. Throughout the test run, the GTX 690’s frame latencies remain below 22 milliseconds or so, and they’re often under the magical 16.7 millisecond mark that, if it’s steady, translates into 60Hz or 60 FPS. Some of the other cards don’t fare as well, especially the Radeon HD 6990, whose spike-riddled plot reveals frame times that are often rather high. The GTX 590’s plot looks more like a cloud than a line, suggesting that it has some jitter going on, as well.

Just looking at the FPS average, the GTX 690 ties the GTX 680 SLI team, well ahead of anything else. Two Radeon HD 7970s in CrossFire, surprisingly enough, aren’t any faster than a GeForce GTX 590.

Of course, we’ve established that FPS averages don’t tell the whole story. We can get a better sense of the overall frame latency picture by looking at the 99th percentile frame time. Simply put, for each card, this number means that 99% of all frames were rendered in x milliseconds or less. Since we’re looking at a point where the vast majority of frames have been completed, the effects of any micro-stuttering oscillations will be reflected in this result.
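
Computing that number is simple enough; here is a minimal sketch in plain Python, with an invented oscillating run to show why the metric catches micro-stuttering:

```python
def percentile_99(frame_times):
    # Sort ascending and step 99% of the way into the list
    ordered = sorted(frame_times)
    idx = min(len(ordered) - 1, int(0.99 * len(ordered)))
    return ordered[idx]

# Half the frames arrive quickly, half slowly, as in a micro-stutter pattern
sample = [6.0, 23.0] * 500
print(percentile_99(sample))  # 23.0, even though the FPS average looks great
```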

Switching to this more latency-centered indicator does some interesting things for us. First and foremost, it brings the GTX 690 and the GTX 680 SLI back to the pack. Those two are only a couple of milliseconds ahead of a single GTX 680 in this measurement. Oddly enough, the Radeon HD 7970 CrossFire config looks to have higher latencies at the 99th percentile than a single 7970 card does. Worst of all for AMD, the Radeon HD 6990 looks like a basket case. Going by FPS alone, the 6990 would appear to be just a few ticks behind the 7970. A look at the latency picture reveals the gulf between the 6990 and everything else.

Then again, 99th percentile frame times are just one point along a whole latency curve, and we can show you how that curve looks.

With multi-GPU products in the mix, these latency curves are more interesting than ever. You can see that the Nvidia GeForce GTX 690 and the 680 SLI config are evenly matched throughout the test run, with no real weaknesses. Both solutions deliver frames quickly throughout, although their frame latencies rise, nearly to meet the single GTX 680’s, in the last 5% of frames.

The curve for the Radeon HD 7970 CrossFire setup tells quite the story. Although the dual 7970s deliver half of their frames much more quickly than a single card, their frame times rise at a sharper angle beyond 50%, eventually crossing over at around 82-83%. For the last 16% or so of frames delivered, the single Radeon HD 7970 is quicker. We’re likely seeing two halves of a multi-GPU jitter pattern illustrated in the 7970 CrossFire’s more rapidly ascending curve, and in the final analysis, the single 7970 may be the better of the two solutions.

We can also quantify “badness,” the slowdowns and delays one encounters while playing a game, by looking at the amount of time spent rendering frames above a certain threshold. The theory here is that the more time spent on long-latency frames, the more interruption you’re likely to perceive while playing a game.

We’ve chosen several noteworthy thresholds. The first, 50 milliseconds, equates to 20 FPS. We figure if the frame rate drops below 20 FPS for any length of time, most folks are likely to perceive a slowdown. The next two, 33.3 ms and 16.7 ms, equate to 30 and 60 FPS, respectively, which are traditionally important performance thresholds for gamers. Our three thresholds also equate to 60Hz, 30Hz, and 20Hz, the first three quantization points for a 60Hz display with vsync enabled. If you go beyond any of these points, you’ll be waiting at least one more vertical refresh interval before updating the screen.
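
In code, the tally might look like the sketch below. We sum the portion of each frame that exceeds the threshold; counting entire offending frames would be another defensible reading. The frame times here are invented:

```python
def time_beyond(frame_times, threshold_ms):
    # Accumulate only the overage of each long-latency frame
    return sum(t - threshold_ms for t in frame_times if t > threshold_ms)

run = [14.0, 15.5, 52.0, 16.0, 38.0, 15.0]  # hypothetical frame times (ms)
for limit in (50.0, 33.3, 16.7):
    print(limit, time_beyond(run, limit))
```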

As you might have expected, only the Radeon HD 6990 suffers any really substantial slowdowns, and even it doesn’t waste too much time working on frames above 50 milliseconds.

When we ratchet the threshold down to 16.7 ms, the GTX 690 and 680 SLI really separate themselves from the pack. A single GTX 680 card spends about three times as long as a GTX 690 or two GTX 680s in SLI above 16.7 milliseconds—so in a really picky way, the GTX 690 is measurably better at minimizing wait times for those worst-case frames.

Notably, the Radeon HD 7970 single-card and CrossFire configs are essentially tied here. Adding a second Radeon doesn’t appear to help at all in the most difficult cases.

Batman: Arkham City

We did a little Batman-style free running through the rooftops of Gotham for this one.

All of the plots for this game show lots of spikes, or occasional long frame times, something we’re used to seeing from Arkham City—doesn’t seem to matter much whether you’re dealing with one GPU or two.

The AMD and Nvidia offerings are much more competitive with one another here than they were in Skyrim, at least in the FPS sweeps.

Switching to the 99th percentile frame times whittles away the gap between the multi-GPU solutions and their single-chip equivalents. The GTX 690 still looks good here, but it’s no quicker than a pair of 7970s.

Although all of the solutions had spiky lines in the initial plots of frame times, the latency curve illustrates how two of the solutions, the single Radeon HD 7970 and the Radeon HD 6990, produce a higher proportion of long-latency frames than everything else. Again, the 6990’s sharper rise from the halfway point to about 88% suggests some longer frame times as part of a micro-stuttering pattern. However, the single 7970 struggles mightily with that last 7% of frames, all of which take longer than 60 milliseconds to render. Interestingly, in this case, adding a second card for CrossFire essentially eliminates those struggles.

The GTX 690 again looks excellent throughout, even surpassing the 7970 CrossFire config.

In this test scenario, with either Radeons or GeForces, you can substantially reduce slowdowns by adding a second GPU. That’s true for the GTX 680/690, and it’s true for the Radeon HD 7970, as well. The multi-GPU options look pretty darned good in that light.

Battlefield 3

We tested Battlefield 3 with all of its DX11 goodness cranked up, including the “Ultra” quality settings with both 4X MSAA and the high-quality version of the post-process FXAA. Our test was conducted in the “Operation Guillotine” level, for 60 seconds starting at the third checkpoint.

The multi-GPU plots look… cloudy. Could prove interesting.

Notice the disparity between the FPS average and the 99th percentile frame times. Although the 7970 CrossFire config is far and away at the top of the FPS charts, it’s only slightly quicker than the GTX 690 at the 99th percentile. Why is that?

Well, both the 7970 CrossFire and the Radeon HD 6990 show a classic micro-stuttering split, as their frame times rise sharply at around the 50th percentile mark. The disparity is greater for the 6990, which looks to be more performance constrained. Even with that handicap, the 7970 CrossFire config at least matches the GTX 690 and 680 SLI across the latter half of the latency curve, which is why it still has a slight advantage at the 99th percentile. In other words, the 7970 CrossFire config is still a very competitive performer, even though it doesn’t quite live up to its towering advantage in the FPS results.

The worst performer here is the GeForce GTX 590, whose last 10% of frames take between 60 and 120 milliseconds to render.

I like the filtering effect we get from these three thresholds. If you’re looking to avoid real slowdowns, consider the 50-ms results, where only the 6990 and the GTX 590 really show any notable issues. At 33.3 ms, the three multi-GPU solutions from the current generation are in a class by themselves, while the Radeon HD 7970 carves out a clear advantage over the GeForce GTX 680. You’re better off with either current-gen single GPU than you are with a dual-GPU card from the prior generation, though. Crank the limit down to 16.7 ms, and the ranks of the top three remain the same, but all of the other solutions look to be pretty similar.

Crysis 2

Our cavalcade of punishing but pretty DirectX 11 games continues with Crysis 2, which we patched with both the DX11 and high-res texture updates.

Notice that we left object image quality at “extreme” rather than “ultra,” in order to avoid the insane over-tessellation of flat surfaces that somehow found its way into the DX11 patch. We tested 60 seconds of gameplay in the level pictured above, where we gunned down several bad guys, making our way across a skywalk to another rooftop.

You’ll see a couple of unexpected things in these results. First, the GeForce GTX 680 SLI is quite a bit slower than the GTX 690, which is unusual. Second, neither of those solutions performs very well compared to the Radeon HD 7970 CrossFire team. Why?

Well, looks like we found a bug in Nvidia’s graphics drivers. Here are the FPS averages for the GTX 680 SLI rig, runs one to five: 62, 44, 43, 44, 44. And here are the averages for the GTX 690: 59, 52, 51, 52, 50. In both cases, the first test run is faster than all of the rest. The GTX 590 suffered from the same problem, but none of the single-card configs did, nor did any of the Radeon setups. Looks like something bad happens when you exit and load a saved game on the Nvidia multi-GPU setups. After you’ve done that once, performance drops and doesn’t come back until you exit the game completely and start it up again. For whatever reason, the GTX 680 SLI setup suffers more from this issue than the GTX 690 does.

I briefly considered re-testing the Nvidia GeForce GTX 690 and company by exiting the game between runs, but I figure this problem is one that anybody who plays Crysis 2 will encounter. Seems like fair game to include it. Multi-GPU solutions tend to be prone to these sorts of issues, anyhow.

While we’re at it, these results are affected by a problem with the Radeons, as well. Let’s zoom in on the very beginning of the test runs to get a closer look.

One of the first motions of our test run, after loading a saved game, is to turn around 180° or so and face another direction. When we do so on the Radeon cards, we get some long delays, multiple frames that take 60 milliseconds or more. I didn’t know whether to blame the game engine or the Radeons, and I was considering doing a quick spin-around move to warm the GPU caches before starting the test run—until I started testing the GeForces, and the slowdowns became vastly less pronounced, almost imperceptible in most cases. You can see a single 70-millisecond frame from the GTX 690 above, but the following frames clock in at under 50 ms. There’s a lot more white area under those two Radeon lines, which means more time spent waiting. Again, I figured this problem was fair game to include, in the grand scheme of things.

Notice, also, that you can see the multi-GPU jitter patterns in both the GTX 690 and the 7970 CrossFire plots above. The GTX 690’s is less pronounced, but it’s still unmistakable.

Even with these two issues affecting the different camps, we still have some clear takeaways from these results. Of course, the Radeons all pay for that slowdown at the beginning of the test run in our 50-ms threshold results. There’s no escaping that.

Beyond that, the prior-generation multi-GPU cards look really quite poor, with the two worst latency curves from the 50th percentile on up and, thus, the most time spent beyond each of our three thresholds. You’re obviously better off with a single GTX 680 or 7970 than with a GTX 590 or 6990.

Finally, even with the Nvidia driver issue, the GTX 690 comes out of this test looking quite good in terms of the overall latency picture and its ability to avoid slowdowns.

Power consumption

These power consumption numbers aren’t quite what we’d expected, based on prior experience. Driving three monitors appears to change the math in some cases. For example, the Radeon HD 7970’s ZeroCore power feature evidently doesn’t work with three displays attached. Both the single 7970 and the primary card in our CrossFire team refused to enter the ZeroCore state when the display dropped into power-save mode. Their fans never stopped spinning. Not only that, but the idle power consumption numbers for a single 7970 are quite a bit higher than we saw with a single display attached. (And we’re not measuring the display power consumption here, just the PC’s power draw at the wall socket.) We’re also not sure why the GeForce GTX 680 SLI rig pulled more power with the display in power-save mode, but it did.

Given everything, the GTX 690’s power consumption is remarkably low, both when idle and when running Skyrim across three displays, which is how we tested under load. Our Nvidia GeForce GTX 690-based test system pulled fully 100W less than the same system with a GTX 590 installed.

Noise levels and GPU temperatures

The Nvidia GeForce GTX 690’s acoustic profile is, if anything, even more impressive than its power draw numbers. Nvidia’s new dual-GPU card achieves noise levels very similar to a single GeForce GTX 680, which is one of the quieter cards on the market. The 690 doesn’t come by its low decibel readings cheaply, either—it maintains lower GPU temperatures under load than almost anything else we tested.

If you’re wondering about why the single Radeon HD 7970 produced higher noise levels and higher temperatures than the 7970 CrossFire config, well, so am I. Some of the issue, I think, is that we have an asymmetrical CrossFire team, with different coolers on the two cards. Somehow, using them together produces a different fan speed policy, it seems. Also, of course, noise is not additive, so putting in a second card doesn’t always lead straightforwardly to higher decibel readings. Another contributor may be relatively higher GPU utilization in the single-card config, since 7970 CrossFire performance doesn’t appear to scale well in Skyrim. We may have to try testing with a different game next time.

Final Conclusions and Closing Thoughts

Well, now we have some performance numbers for the GeForce GTX 690. How correct they are, we’re not entirely sure. I will say this: regardless of the fact that we’ve not accounted for the potentially positive effects of frame metering, the GeForce GTX 690 looks to be the fastest overall graphics card on the planet. The GTX 690 even does well in our latency-sensitive metrics. Although it’s rarely twice as fast as a GeForce GTX 680 in terms of 99th percentile frame times, the GTX 690’s overall frame latency picture, as measured in Fraps, is generally superior to the GTX 680 by a nice margin. The GTX 690 also does a nice job of minimizing those painful instances of long-latency frames, improving on the performance of the GTX 680 in that regard in solid fashion.

Since we’re not entirely confident in our numbers, I’ll offer a few subjective impressions from my testing, as well. I started the process with the dual 7970 CrossFire team, and I chose the image quality settings for testing based on how well that config could run each game. My goal was to stress the multi-GPU solutions appropriately without cranking up the image quality options so far the games would be unplayable on the single-GPU cards.

I was initially surprised to see how easily the 7970 CrossFire config could be pushed to its limits at a six-megapixel resolution. The settings I chose yielded playable but not entirely fluid animation on the 7970 CrossFire rig (with the exception of BF3, which was stellar on the 7970s with the quality maxed out.) I was fearful of whether these games would truly be workable on a single 7970, but it turns out that I shouldn’t have worried. For the most part, playability wasn’t substantially compromised on a single card. However, playability was compromised when I switched over to the Radeon HD 6990. Although its FPS averages were generally higher than a single 7970’s, the experience of playing games with it was instantly, obviously worse. You might have guessed that by looking at our latency-focused numbers from Fraps, but the subjective experience backed that up.

From there, I switched to the green camp and the GeForce GTX 590, which was a bit of an upgrade from the 6990 in terms of overall smoothness—and it wasn’t such a basket case in Skyrim. When I swapped in a single GTX 680, though, the experience changed. The GTX 680 felt like a clear improvement in playability, after having tested the 6990 and GTX 590 back to back before it. The power of a single, fast GPU is something to be respected. I remember seeing the GTX 680’s FPS average at the end of the first Arkham City test run and being shocked at how low the number was (the card averaged 32 FPS) given the quality of the seat-of-the-pants experience.

Then again, I wasn’t getting much sleep during this period, and I’d overclocked my entire nervous system via copious amounts of tasty Brazilian coffee. Cerrado Gold, baby. Breakfast of champions.

The GTX 680 SLI config and the GTX 690 came next, and subjectively, the experience offered by the two was indistinguishable. Both were obviously faster in places where the GTX 680 felt strained, and I’d say they offered a better experience overall—and thus the best of any config I tested. However, it seemed like they’d still run into occasional, brief episodes of sluggishness that one didn’t experience on a single GTX 680.

You can make of those subjective impressions what you will. They’re in no way scientific, although I did try to throw in a big word here and there.

Given our seat-of-the-pants impressions and our test results, I’m pretty confident in offering a generally positive recommendation for the GeForce GTX 690. No, it’s not “twice as fast” as a GeForce GTX 680 in any meaningful sense, and a coldly rational buyer probably wouldn’t want to pay twice the GTX 680’s price for it. However, it is as quick as two GTX 680s in SLI, which makes it the highest-performance video card we’ve ever used. Furthermore, Nvidia has gone above and beyond with the GTX 690’s industrial design, materials, acoustics, and power efficiency, all of which are exemplary, outstanding, and most other positive words you might wish to use.

Cost Justification?

If you’re serious about buying one of these cards, you probably understand the logic of such a purchase better than I do. I’m not sure how one would justify the price, but Nvidia has given folks lots of shiny little excuses—and they’ve muted the drawbacks like excess noise, too. There’s not much not to like here, other than that fantastic, breathtaking, prodigious price tag. I suspect some folks will overcome that one obstacle without much trouble.

As for the lingering questions about multi-GPU micro-stuttering and the effectiveness of frame metering, we have several things in the works. There’s a tremendous amount of work still to be done, and we have a lot of other projects on our plate, so be patient with us, folks. Shortly before the completion of this review, we did finally receive that high-speed camera we’d ordered. We’ve already managed to capture a serviceable video at 240 FPS, four times our display’s refresh rate. The resolution isn’t too high, but it’s enough to show whether the animation is smooth. Have a look:

You can easily see the strobe of the monitor’s CCFL backlight, and since we had vsync disabled, several instances of tearing are clearly visible. We think this tool should allow for some worthwhile comparisons, eventually.

All the cool kids follow me on Twitter.

Our 6 Favorite Tech Review Podcasts

Which tech review podcast is best for you and your technology usage? Here we sift through the noise to offer the six we find most helpful.

Technology has a massive influence on our world today. New smartphones, gadgets, and software seem to be released every single week, each with newer and better features than the last. With so many new innovations, it can be difficult for regular people to keep track. Luckily, with this new wave of technology, we are also receiving a new wave of media content. One such form of this new content takes shape in tech review podcasts.

Podcasts are one of the largest forms of new media, with hundreds of millions of listeners every single day. The podcasts take on many different shapes and forms, but some of our favorites are tech review podcasts.

These podcasts evaluate, analyze, and review new tech releases, and they are often extremely helpful and informative when you’re considering a new purchase. So, without further ado, here are our six favorite tech review podcasts.

Clockwise

The Clockwise podcast is a great option for any tech-savvy listeners who are short on time.

Hosted by Dan Moren and Mikah Sargent, the show features two guests joining the hosts in each episode. Listeners get to hear the panel break down and discuss four different tech topics per show.

The show is great because it is released frequently, ensuring listeners can stay up to date on current news in the tech world.

Our only complaint about the show is that sometimes we wish it were longer, because it is simply that good. Clockwise is an easy choice as one of our favorite tech review podcasts.

Accidental Tech

Hosted by Casey Liss, Marco Arment, and John Siracusa, this podcast is a great option for die-hard tech enthusiasts.

The show’s hosts are all developers and, as such, they provide detailed insight on all things tech. That insight, combined with the chemistry between the hosts, is what carries the show.

The show is informative and intelligent but never loses its ability to have fun and keep things light. If you are interested in all things tech, from programming to development, then you need to check out the Accidental Tech podcast.

Rocket

With no official website, the Rocket podcast can be found on Twitter, Apple Podcasts, Relay FM, and more.

The show features three female co-hosts: Brianna Wu, Simone de Rochefort, and Christina Warren. One of the first and most popular all-female tech podcasts, Rocket is great for anyone looking to really get into their geek mode.

The three hosts discuss everything from tech to comics and movies. The banter between the hosts is light and cheerful, and it is an enjoyable listen every episode. Rocket is easily one of our favorite tech review podcasts.

Tech Talker’s Quick and Dirty Tips

The next entry on our list of favorite tech review podcasts is best for those who rarely have time to listen to one.

Tech Talker episodes generally last under 10 minutes and provide helpful information on the technological world. These shows usually take the form of quick tips and guidance for listeners on subjects such as cybersecurity and more.

While not the most entertaining podcast on the list, this show provides listeners with quick, useful information. The unique and insightful layout of the show makes it a great listen for anyone on the move.

Note to Self

The Note to Self podcast focuses on examining technology from the human aspect of things.

It examines more closely the moral and ethical implications of our rapid technological advancement. The show is hosted by Manoush Zomorodi, and episodes usually last between 20 and 30 minutes.

The podcast focuses on helping you maneuver through today’s digital world and maintain your humanity despite technological innovations. The deep discussions on this show make it an easy choice as one of our top tech review podcasts.

The Big Web Show

Last but not least is The Big Web Show, hosted by Jeffrey Zeldman. With episodes often running over an hour in length, this is a great show for long commutes.

During each show, Zeldman invites an expert from one of the many fields of the web to share their story. The show achieves a great balance of intellectual discussion, advice, and anecdotes.

The host does an outstanding job of interviewing each guest and keeping each show fun and informative. Each guest brings a unique perspective and set of experiences that keep things engaging throughout. The Big Web Show is a fantastic option for any die-hard tech fans who want to learn more about the web.

Introduction

This article is a review of the AMD Radeon R9 285 graphics card. It includes photos, tables, graphs, game testing, costs, and more.

Full Review of AMD Radeon R9 285 Graphics Card

As a guy who reviews video cards, it’s pretty easy to become cynical about these things. That’s been especially true during the past couple of years, as we’ve seen the same handful of graphics chips spun into multiple “generations” of products. The core GPU technology is a wonder, but the endless re-spins get to be tiresome.

When AMD revealed the imminent arrival of the AMD Radeon R9 285 recently, I have to admit, I wasn’t exactly thrilled. Yes, the R9 285 would be based on a new chip, code-named Tonga, but that chip just looked to be a cost-reduced and slightly tweaked variant of existing silicon—not quite the stuff of legend.

The Radeon R9 285 by the numbers

Heck, have a look at the specs for the Radeon R9 285 versus the card it replaces, and you’ll see what I mean.

Card            Boost clock (MHz)   ROP pixels/clock   Textures filtered/clock   Shader processors   Memory interface width (bits)   Memory transfer rate   Board power   Starting price
Radeon R9 280   933                 32                 112                       1792                384                             5 GT/s                 250W          $279
Radeon R9 285   918                 32                 112                       1792                256                             5.5 GT/s               190W          $249
Radeon R9 280X  1000                32                 128                       2048                384                             6 GT/s                 250W          $299

In terms of key specs, the principal change from the Radeon R9 280 to the R9 285 is the move from a 384-bit memory interface to a 256-bit one. The narrower interface should make the R9 285 cheaper to produce, but it will also mean less memory bandwidth—and memory bandwidth is one of the primary performance constraints in today’s graphics cards. Aside from the reduction in memory throughput, the R9 285 appears to be very similar to the R9 280 card that it replaces (and to the Radeon HD 7950 that came before it and was essentially the same thing.)
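
Those bandwidth figures fall straight out of the interface width and transfer rate in the table above, as a quick worked example shows:

```python
def memory_bandwidth_gb_s(bus_width_bits, transfer_rate_gt_s):
    # Each transfer moves bus_width_bits / 8 bytes
    return (bus_width_bits / 8) * transfer_rate_gt_s

print(memory_bandwidth_gb_s(384, 5.0))  # Radeon R9 280: 240.0 GB/s
print(memory_bandwidth_gb_s(256, 5.5))  # Radeon R9 285: 176.0 GB/s
```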

Ho-hum, the cost.

Worse, the $249 starting price for the R9 285 doesn’t seem like much of a bargain, given that the R9 280 is going for $219 at online retailers right now, presumably while they close out stock to make room for the new card. That all felt like kind of a raw deal, frankly. What could AMD be thinking?

My, uh, lack of enthusiasm was dampened somewhat when the first example of the R9 285 arrived in Damage Labs. Behold the MSI Radeon R9 285 Gaming OC Edition:

This puppy is gorgeous, and its twin-fan cooler performs as well as its looks suggest. There’s good news on the performance front, too, since MSI has cranked up the Boost clock to 973MHz, 55MHz above stock.

At least MSI was doing good work with its part of the equation.

Still, I thought, a snazzy cooler and paint job couldn’t fix the basic problem with the R9 285. Although the new Radeon was up to snuff elsewhere, its memory bandwidth just looked a bit anemic.

Card             Peak pixel fill rate (Gpixels/s)   Peak bilinear filtering, int8/fp16 (Gtexels/s)   Peak shader arithmetic (tflops)   Peak rasterization rate (Gtris/s)   Memory bandwidth (GB/s)
Radeon R9 280    30                                 104/52                                           3.3                               1.9                                 240
Radeon R9 285    29                                 103/51                                           3.3                               3.7                                 176
Radeon R9 280X   32                                 128/64                                           4.1                               2.0                                 288
GeForce GTX 760  33                                 99/99                                            2.4                               4.1                                 192
GeForce GTX 770  35                                 139/139                                          3.3                               4.3                                 224

The MSI card’s higher boost clocks would give it a bit more oomph in some categories than the stock numbers shown for the R9 285 above, but it wouldn’t do anything to address the biggest issue. The R9 285’s 176 GB/s of memory bandwidth is just a lot less than the R9 280’s 240 GB/s—and quite a bit less than what the competing GeForces have to offer, too.

Tonga’s dilemma

So you can understand my reticence. Tonga looked to be nothing more than a dreary re-spin of AMD’s existing technology, based on the same Radeon DNA as the Hawaii GPU introduced one year ago—and the Bonaire chip first outed in March of 2013. Those chips weren’t that different from the Tahiti GPU that debuted at the end of 2011.

To be fair, AMD did make some notable improvements in Hawaii and Bonaire. Both of those chips have the TrueAudio DSP block onboard, so that games can offload audio processing to a dedicated hardware unit on the GPU. Those chips include a new XDMA data transfer mechanism for CrossFire, which allows frame data to be transferred via PCI Express instead of over an external bridge. Hawaii and Bonaire also have updated display outputs with support for the latest DisplayPort standards.

In fact, AMD tells us that only Radeon cards based on Bonaire, Hawaii, and Tonga will support the variable refresh displays being enabled by its Project FreeSync initiative. I wasn’t aware that older Radeons would be excluded, but apparently they will.

One other addition in Hawaii—and now Tonga—is a smarter version of AMD’s PowerTune dynamic voltage and frequency scaling (DVFS) scheme. The new PowerTune monitors the GPU’s current state constantly and makes fine-grained adjustments to clock speeds and supplied voltages in order to keep the GPU within its pre-defined thermal and power peaks. The smarter PowerTune algorithm allows the graphics chip to squeeze out every ounce of performance possible within those limits.
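
Conceptually, that loop looks something like the sketch below. The states, voltages, and thresholds are invented for illustration; AMD hasn’t published its real tables or sample rates:

```python
# Hypothetical frequency/voltage states, highest first (MHz, volts)
FREQ_STEPS = [(918, 1.15), (875, 1.10), (825, 1.05), (775, 1.00)]
POWER_LIMIT_W = 190.0  # the R9 285's board power

def next_step(estimated_power_w, step):
    # Over the limit: drop to a lower state. Comfortable headroom: climb back.
    if estimated_power_w > POWER_LIMIT_W and step < len(FREQ_STEPS) - 1:
        return step + 1
    if estimated_power_w < 0.95 * POWER_LIMIT_W and step > 0:
        return step - 1
    return step

step = 0
for sample in (185.0, 196.0, 199.0, 182.0, 170.0):  # made-up power readings
    step = next_step(sample, step)
    print(sample, FREQ_STEPS[step])
```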

These tweaks are all well and good, and Hawaii in particular is a truly impressive GPU specimen, but they don’t do that much to improve the GPU’s fundamental performance or efficiency. Hawaii gets its potency from sheer scale more than anything else.

The Reality

That reality was a problem for Tonga, in my view, because Maxwell is coming. Nvidia has already released a small-scale version of its new GPU architecture aboard the GeForce GTX 750 Ti, and we know the little Maxwell is about twice as power-efficient as the corresponding chip based on the Kepler architecture. Nvidia is widely rumored to be prepping larger Maxwell derivatives for release soon. Those chips are likely to convert this architecture’s increased power efficiency directly into higher performance. If Tonga were just a cost-reduced Tahiti chip with TrueAudio added, AMD could be in for a world of hurt.

Tonga: it’s a magical place

Turns out my worries were misplaced, because Tonga is not just a smaller version of Hawaii. A year after the release of that bigger GPU, AMD has slipstreamed some significant new technology into Tonga—and has done so rather quietly, without a branding change or any of the usual fanfare. In fact, I had to prod AMD a little bit in order to understand what’s new in Tonga. I don’t yet have a clear picture of how everything works, but I’ll share what I know.

By far the most consequential innovation in Tonga is a new form of compression for frame buffer color data. GPUs have long used various forms of compression in order to store color information more efficiently, but evidently, the method Tonga uses for frame buffer data is something novel. AMD says the compression is lossless, so it should have no impact on image quality, and “delta-based.” Tonga’s graphics core knows how to read and write data in this compressed format, and the compression happens transparently, without any special support from applications.

We don’t have many details on exactly how it works, but essentially, “delta-based” means the compression method keys on change. My best bet is that whenever a newly completed frame is written to memory, only the pixels whose colors have changed from the frame prior are updated. ARM does something along those lines with its Mali mobile GPUs, and I expect AMD has taken a similar path.
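
Here is a toy illustration of that guess in Python: store only the pixels that changed, and reconstruct the frame bit-exactly on the other end. The real hardware presumably operates on tiles and packs its deltas far more cleverly; this shows only the principle:

```python
def delta_encode(prev_frame, new_frame):
    # Record (index, new_value) pairs only for pixels that changed
    return [(i, px) for i, (old, px) in enumerate(zip(prev_frame, new_frame))
            if px != old]

def delta_decode(prev_frame, deltas):
    frame = list(prev_frame)
    for i, px in deltas:
        frame[i] = px
    return frame  # bit-exact reconstruction: the scheme is lossless

prev = [10, 10, 20, 30]
new = [10, 11, 20, 30]
deltas = delta_encode(prev, new)  # [(1, 11)] -- only one pixel changed
assert delta_decode(prev, deltas) == new
```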

The payoff is astounding: AMD claims 40% higher memory bandwidth efficiency. I’m not quite sure what the basis of comparison is for that claim, nor am I clear on whether 40% is the best-case scenario or just the general case. But whatever; we can measure these things.

3DMark Vantage’s color fill test has long been gated primarily by memory bandwidth, rather than the GPU’s raw pixel fill rate. Here’s how Tonga fares in it.

Comparing the Two

Whoa. Compare the R9 285 to the Radeon HD 7950 Boost, which we used in place of the Radeon R9 280. (Only 8MHz of clock speed separates them.) The 7950 Boost has 240 GB/s of memory bandwidth to Tonga’s 176 GB/s, yet the new Radeon maintains a substantially higher pixel fill rate. That’s Tonga magic in action.

Perhaps my concerns about Tonga’s memory bandwidth were premature. We’ll have to see how well this compression mojo works in real games, but it certainly has my attention.

That’s not all. Tonga has inherited a new front-end and internal organization from Hawaii that grants it more potential for polygon throughput. The triangle setup rate has doubled from two primitives per clock in Tahiti to four per clock in Tonga. Beyond that, Tonga adds some of its own provisions to improve geometry and tessellation performance, including a larger parameter cache that spills into the L2 cache when needed. The division of work between the geometry front-end units has been improved, and these units can better re-use vertices, which AMD says should help performance in cases where “many small triangles” are present.
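
That doubling of the setup rate is exactly what separates the rasterization figures in the rates table above, as a quick check confirms:

```python
def peak_tri_rate_gtris_s(prims_per_clock, clock_mhz):
    # Primitives per clock times clock speed gives peak triangle throughput
    return prims_per_clock * clock_mhz / 1000.0

print(peak_tri_rate_gtris_s(2, 933))  # Tahiti-based R9 280: ~1.9 Gtris/s
print(peak_tri_rate_gtris_s(4, 918))  # Tonga-based R9 285:  ~3.7 Gtris/s
```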

These architectural modifications more than bring the R9 285 up to par with its nearest rival, the GeForce GTX 760, in terms of geometry throughput. Tonga also surpasses the Hawaii-based Radeon R9 290X in this synthetic test of tessellation performance.

Between the new color compression method and the geometry performance gains, Tonga could plausibly claim to usher in a new generation of Radeon technology. The use of the GCN or “Graphics Core Next” label has proven incredibly flexible inside the halls of AMD, but what we’re seeing here sure feels like a fundamental shift.

That’s not the full extent of the changes, either. AMD has revamped Tonga’s media processing capabilities in order to ensure fluid performance and high-quality images in the era of 4K video. That starts with a new hardware image-scaling block in the display pipeline. This scaler is capable of upscaling to and downscaling from 4K video targets in real time.

In a related move, the graphics core has gained some new instructions for 16-bit integer and floating-point media and compute processing at reduced power levels. Also, both the video decode (UVD) and encode (VCE) engines on Tonga have been upgraded to allow for higher throughput. The UVD block now supports the MJPEG standard and can decode high-frame-rate 4K video compliant with the High Profile Level 5.2 spec. The beefier VCE block can encode 1080p video at 12X the real-time rate and is capable of encoding 4K video, as well.


Source: AMD

We’ve had limited time to test Tonga, so we haven’t been able to scrutinize its video processing chops yet. Above are some encoding performance results that AMD supplied to reviewers showing the R9 285 outperforming the GeForce GTX 760. Make of them what you will.

But wait, there’s more!

One of the strange things about Tonga’s introduction to the world is that it’s debuting in a product where it’s at less than full strength. AMD hasn’t provided a ton of info about the full GPU, perhaps as a result of that fact, but below is my best guess at how Tonga looks from 10,000 feet.

The image above shows eight compute units per shader engine, with four shader engines across the chip. AMD has confirmed to us that Tonga is indeed hiding four more compute units than are active in the R9 285, so the diagram above ought to be accurate in that regard. Here’s my best estimate of how Tonga stacks up in terms of key metrics versus its closest competition.

Chip     ROP pixels/clock   Texels filtered/clock (int/fp16)   Shader processors   Rasterized triangles/clock   Memory interface width (bits)   Est. transistors (millions)   Die size (mm²)   Process node
GK104    32                 128/128                            1536                4                            256                             3500                          294              28 nm
GK110    48                 240/240                            2880                5                            384                             7100                          551              28 nm
Tahiti   32                 128/64                             2048                2                            384                             4310                          365              28 nm
Tonga    32                 128/64                             2048                4                            256                             5000                          359              28 nm
Hawaii   64                 176/88                             2816                4                            512                             6200                          438              28 nm

The die size and transistor count for Tonga above come directly from AMD. What fascinates me about these figures is that Tonga is barely any smaller than Tahiti. The idea that Tonga is a cost-reduced version of Tahiti pretty much goes out of the window right there.

Look at the transistor count, though. Tonga packs in roughly five billion transistors, while Tahiti is less complex, at 4.3 billion. Both chips are made at TSMC on a 28-nm process. How is it that Tonga’s not quite as large as Tahiti yet has more transistors?

Since the chips are separated by three years, I suspect GCN compute units in Tonga are more densely packed than those in Tahiti. AMD has had more time to refine them. That said, we know that the two GPUs have the same number of compute units, so presumably Tonga doesn’t get its much higher transistor count from its shader core. All of the other additions we’ve talked about, including the TrueAudio DSP block, the color compression capability, and video block enhancements, add some complexity. I doubt they’re worth another 700 million transistors, though.

My best guess is that most of the additional transistors come from cache, perhaps a larger L2. SRAM arrays can be very dense, and a larger L2 cache would be a natural facilitator for Tonga’s apparently quite efficient use of memory bandwidth. I’ve pinged AMD about the size of Tonga’s L2 cache but haven’t heard back yet.

Another question these numbers raise is whether Tonga natively has a 256-bit memory interface. Generally, the size of a chip like this one is dictated by the dimensions of the I/O ring around its perimeter. Since Tonga occupies almost the same area as Tahiti, it’s got to have room to accommodate a 384-bit GDDR5 interface. Surely we’ll see a Radeon R9 285X card eventually with a fully-enabled Tonga GPU clocked at 1GHz or better. If I were betting, I’d put my money on that card having a 384-bit path to memory.

Game Testing the AMD Radeon R9 285

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Our test systems were configured like so:

Processor: Core i7-3820
Motherboard: Gigabyte X79-UD3
Chipset: Intel X79 Express
Memory size: 16GB (4 DIMMs)
Memory type: Corsair Vengeance CMZ16GX3M4X1600C9 DDR3 SDRAM at 1600MHz
Memory timings: 9-9-9-24 1T
Chipset drivers: INF update 9.2.3.1023, Rapid Storage Technology Enterprise 3.6.0.1093
Audio: Integrated X79/ALC898 with Realtek 6.0.1.7071 drivers
Hard drive: Kingston HyperX 480GB SATA
Power supply: Corsair AX850
OS: Windows 8.1 Pro
Card                  Driver revision        Base clock (MHz)   Boost clock (MHz)   Memory clock (MHz)   Memory size (MB)
Radeon HD 7950 Boost  Catalyst 14.7 beta 2   —                  925                 1250                 3072
Radeon R9 285         Catalyst 14.7 beta 2   —                  973                 1375                 2048
Radeon R9 280X        Catalyst 14.7 beta 2   —                  1000                1500                 3072
GeForce GTX 760       GeForce 340.52         980                1033                1502                 2048
GeForce GTX 770       GeForce 340.52         1046               1085                1753                 2048

Thanks to Intel, Corsair, Kingston, and Gigabyte for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Also, our FCAT video capture and analysis rig has some pretty demanding storage requirements. For it, Corsair has provided four 256GB Neutron SSDs, which we’ve assembled into a RAID 0 array for our primary capture storage device. When that array fills up, we copy the captured videos to our RAID 1 array, comprised of a pair of 4TB Black hard drives provided by WD.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Watch Dogs



Yep, Tonga’s color compression magic translates pretty well into performance in real games. The R9 285 effortlessly outperforms the AMD Radeon HD 7950 Boost—which, again, is essentially the same thing as the Radeon R9 280. (Sorry, didn’t realize until too late that Damage Labs didn’t have an R9 280 on hand, so you’ll have to settle for a different name on the graph labels.)

The newest Radeon also outdoes the GeForce GTX 760, its closest competitor, and that card’s bigger brother, the GTX 770.


These “time spent beyond X” graphs are meant to show “badness,” those instances where animation may be less than fluid—or at least less than perfect. The 50-ms threshold is the most notable one, since it corresponds to a 20-FPS average. We figure if you’re not rendering any faster than 20 FPS, even for a moment, then the user is likely to perceive a slowdown. 33 ms correlates to 30 FPS or a 30Hz refresh rate. Go beyond that with vsync on, and you’re into the bad voodoo of quantization slowdowns. And 16.7 ms correlates to 60 FPS, that golden mark that we’d like to achieve (or surpass) for each and every frame.

Not only does the R9 285 perform well in terms of FPS averages, but it also generates new frames with consistent quickness. The new Radeon barely ever surpasses our 33-ms badness threshold. In other words, the frame rate barely ever drops below 30 FPS, even for an instant. For that reason, Watch_Dogs is quite playable on the R9 285 at 2560×1440. In fact, that’s the display resolution AMD has identified as a good target for this card.

Crysis 3


Nvidia has done some nice work with its graphics drivers in the past year or so, cutting out cases where games slow down for whatever reason. That work pays off here, as you can see in the plots, in the two places where I fire exploding arrows at the bad guys during our test run. The Radeons have a couple of big frame-time spikes in their plots, while the GeForces don’t.

Frame times are a little less variable overall on the GeForce cards, and that reflects in the 99th percentile results. Even though the R9 285 produces a slightly higher FPS average than the GeForce GTX 760, the GTX 760 comes out ahead in our more time-sensitive metric.

The frame time curve reveals that the R9 285 generally outperforms the GTX 760, but the GeForce is quicker in the most difficult two to three percent of frames.


My sense is that the frame time spikes and “badness” we see from the Radeons here are something AMD needs to fix with driver optimization. I doubt it’s a reflection on the underlying GPU tech. That said, it does reflect on the quality of the experience gamers will have with the product.

Borderlands 2

Since the last couple of games were a bit challenging for this class of GPU at 2560×1440, I thought I’d include my favorite game as an example of how well these cards can handle a fairly typical Unreal Engine 3-based title.


Wow. That Tonga devil magic does work wonders. The Radeon R9 285’s frame time plots illustrate how very consistently the GPU produces frames in this test scenario. Although it doesn’t produce the most total frames (and thus doesn’t have the highest FPS average), the Radeon R9 285 takes the top spot in our 99th percentile frame time metric.


No matter which of our metrics you use, the AMD Radeon R9 285 handles Borderlands 2 flawlessly, ahead of both the GTX 760 and the 770. That latter card is based on a full-blown GK104 graphics processor. In our time-sensitive metrics, the R9 285 also beats the full-fledged Tahiti card, the Radeon R9 280X.

Thief

I decided to just use Thief’s built-in automated benchmark, since we can’t measure performance with AMD’s Mantle API using Fraps. Unfortunately, this benchmark is pretty simplistic, offering only FPS average and minimum numbers (as well as a maximum, for whatever that’s worth).

Chalk up another shocking win for Tonga. The R9 285 beats both the R9 280X and the GeForce GTX 770 in the Thief benchmark. Good grief.

Notice that the R9 285 doesn’t fare as well with AMD’s close-to-the-metal Mantle API as it does with the game’s default Direct3D mode. By contrast, the Tahiti-based Radeon HD 7950 benefits a bit from the switch to Mantle. Looks to me like Mantle support for the R9 285 may not quite be ready for prime time.

Power consumption

Please note that our “under load” tests aren’t conducted in an absolute peak scenario. Instead, we have the cards running a real game, Crysis 3, in order to show us power draw with a more typical workload.

AMD hinted that the R9 285 ought to be more power-efficient than its older Tahiti-based graphics cards. In our tests, under load, that’s not the case. I suspect the cause of the disconnect here may be the fact that MSI has pushed the frequency up—and likely the voltage, as well—on this Radeon R9 285 Gaming OC Edition card. Given the performance that we’ve seen out of this card, though, its power efficiency seems pretty reasonable.

Noise levels and GPU temperatures

This dual-fan MSI cooler is scary good. As I was testing, I thought subjectively that the R9 285 card got quieter under load than it was at idle. Then, uh, the meter confirmed it.

MSI has been very aggressive in choosing a 65°C tuning point for its R9 285 card. With the new PowerTune, AMD has typically exploited quite a bit more temperature headroom. In truth, though, the R9 285 doesn’t seem to need it.

Final Thoughts on the AMD Radeon R9 285

I haven’t yet had time to test the AMD Radeon R9 285 as extensively as I’d like. I had to throw together this review in a few days after finishing up the Core i7-5960X, and there just wasn’t time to test more games or display resolutions.

That said, I think we have a sense of how surprisingly potent this video card really is, thanks to a GPU that’s packed with more innovation than we expected. Here’s one more reminder of how wildly this card is overachieving given its memory bandwidth. We’re averaging together the results from all of our game tests here.

Compared to the Radeon HD 7950, the R9 285 is a big move up and to the left, in the direction of goodness. The R9 285 easily makes the most effective use of its memory bandwidth of any of the cards we tested. Like I said earlier, this is generational change on the GPU front.
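If you want to reproduce that scatter for yourself, the arithmetic is just average FPS divided by memory bandwidth. The sketch below uses the cards’ published bandwidth specs, but the FPS numbers are placeholders rather than our measured results:

    # Performance per unit of memory bandwidth. FPS values are placeholders;
    # bandwidth figures are the published specs for each card.
    cards = {
        "Radeon R9 285":  {"avg_fps": 48.0, "bandwidth_gbps": 176.0},  # 256-bit @ 5.5 GT/s
        "Radeon HD 7950": {"avg_fps": 45.0, "bandwidth_gbps": 240.0},  # 384-bit @ 5.0 GT/s
    }

    for name, c in cards.items():
        print("%s: %.3f FPS per GB/s" % (name, c["avg_fps"] / c["bandwidth_gbps"]))

By that yardstick, Tonga’s color compression shows up as more frames squeezed out of every gigabyte per second.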

Costs

Our famous value scatter plot tells the rest of the story. In our limited tests, the R9 285 looks to be fast enough to justify its $249 price tag pretty well. I suppose that’s what AMD was thinking, huh? The GeForce GTX 770 sure looks like a raw deal by comparison.

This MSI Gaming OC Edition card is a stone cold killer, too. I’m smitten. MSI has nailed it with this cooler and the board’s overall design. Consider also that AMD is throwing in a trio of games via its Never Settle bundle with the purchase of an R9 285, and you might be tempted to order one right away. I can’t say I’d blame you. R9 285 cards are supposed to be on store shelves today, and AMD tells us it expects an ample initial supply. Most of them will have 2GB of memory, but there will be 4GB variants coming, as well.

Only thing is, I’m pretty sure this is the opening salvo of a protracted battle. If I’m right about Tonga having a 384-bit interface, then the AMD Radeon R9 285X could turn out to be quite the thrill ride. We don’t yet know exactly what Nvidia has cooking, either, but it may well be a Maxwell variant that’s direct competition for Tonga. I suspect we’ll have more reasons to test this magical new GPU in the coming weeks.

The post Full Review of AMD Radeon R9 285 graphics card appeared first on The Tech Report.

The 5 Best Smart Beds of All Time: Costs and Benefits https://techreport.com/review/best-smart-beds/ Tue, 19 Apr 2022 14:40:39 +0000 https://techreport.com/?p=3476991

Did you know that the average human will spend almost 26 years of their life sleeping? The innovation of smart beds has changed the sleep game, and the best smart beds can make a world of difference in how you sleep.
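That 26-year figure checks out with simple arithmetic, assuming roughly eight hours of sleep a night; the 78-year lifespan below is our assumption, not a claim from the article:

    # Rough check on the 26-years-asleep statistic.
    hours_per_night = 8
    lifespan_years = 78  # assumed average lifespan

    fraction_asleep = hours_per_night / 24           # one third of each day
    years_asleep = lifespan_years * fraction_asleep  # 78 / 3 = 26 years

    print("%.0f years asleep" % years_asleep)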

In an increasingly fast-paced world, it is more important than ever before to make sure that you get plenty of rest. Making the most of your rest time is an essential key to making the most of your day. The best way to ensure that you’re maximizing the potential of your sleep is to invest in a great bed.

Smart Beds

In the increasingly digital world, innovation and technology are everywhere. Because of this surge in technology, it is becoming more common to build smart software into other everyday products.

One of the most popular new tech products is the smart bed. Smart beds use sensor technology to gather data about your sleep cycle. They then use this data to self-adjust and optimize settings to give you the best sleep possible. Here are five manufacturers to shortlist if you choose to make this investment.

Saatva Solaire Smart Beds

Our selection for the 5th best smart bed of all time comes from Saatva: the Solaire. This mattress is a great option for those with back pain or a larger frame.

The Saatva Solaire is an innovative airbed with multiple layers of mixed materials. The mattress utilizes a cotton material to ensure air circulation to keep the mattress cool. Below the surface is a gel-infused memory foam that works to target and support pressure points during sleep. An interior support core made of air canisters works to ensure even air distribution across the mattress surface.

Solaire also offers a flex support option that allows each side of the bed to operate separately. This allows for each person to adjust their sleeping firmness accordingly.

The major problem that keeps the Saatva Solaire at No. 5 on the list is the price point. Pricing ranges between $2197 and $4495, and while that is not unrealistic, it is definitely a pricey investment. Still, the combination of unique mattress materials and technology makes the Saatva Solaire an easy choice for one of the best smart beds.

Eight Sleep Pod Pro Mattress

Number 4 on our best smart beds list comes from the company Eight Sleep. The Eight Sleep Pod Pro touts temperature control as its top feature.

The mattress features an automated heating and cooling system and can regulate temperatures of 55-110 degrees Fahrenheit. The Pod Pro also offers a split temperature system so that each side of the bed can control its own sleep temperature. Because of the sensor technology below the surface, the Pod Pro can provide health reports and insights.

The mattress also features a GentleRise alarm system and advanced sleep tracking. These features are designed to not only maximize the value of your sleep, but also to provide a peaceful wakeup.

The GentleRise alarm monitors the data collected by the mattress to wake you during the lightest part of your sleep cycle. Pricing for the Eight Pod ranges from $2995 to $3395 based on mattress size. The top-of-the-line temperature system and reasonable pricing makes the Pod Pro one of the best smart beds on the market.

Sleep Number 360 C2

No. 3 on our list of best smart beds is the Sleep Number 360 C2. The C2 doesn’t offer as many features as some of the other options on the list, but it comes in at No. 3 because of its affordable pricing. While most of the entries on this list cost upwards of $2000, the C2 mattress starts as low as $699.

The Sleep Number C2 offers adjustable mattress firmness and can be customized for each bedside. The mattress also adjusts its firmness as you sleep and provides data on your Sleep IQ. Mattress materials stretch and adjust to support you while you sleep to ensure a productive night’s sleep.

Because of the affordable pricing, the Sleep Number 360 C2 is an easy choice for one of the best smart beds.

ReST Original Smart Bed

Coming in at No. 2 on our list is the ReST Original Smart Bed. The ReST smart bed offers arguably the best features of any competitor on the market.

Like most of the other competitors, the ReST smart bed offers side customization. Unlike the competitors, however, ReST offers total customization of five different ergonomic zones. This means that you can adjust the bed to different settings for various parts of your body, and truly maximize your sleep. The mattress will automatically respond and adjust as you sleep and move to ensure maximum comfort.

The primary downside to the ReST Original Smart Bed is the price. The bed sells for pricing starting at $4274. While the price is steep, ReST is extremely confident in their product, and they offer a 10-year warranty. The top-of-the-line features and materials make this one of the top smart beds available.

Ghost SmartBed

The No. 1 smart bed is arguably the Ghost SmartBed. The Ghost SmartBed has top-of-the-line sensor technology allowing it to adjust quickly to your needs as you sleep. It also has customizable air chambers to control pressure, and a gel polymer layer adapts to your body’s pressure points and temperature.

New smart technology allows you to adjust the mattress from your phone and sends updates on your sleep regularly.

Ghost utilizes top-of-the-line materials to ensure comfortable sleep every time. This combination of high-quality materials and smart technology does lead to a high price point of $6999. Although expensive, the quality features and reliability make the Ghost SmartBed one of the best smart beds available.

The post The 5 Best Smart Beds of All Time: Costs and Benefits appeared first on The Tech Report.

BEYOU Chair: The Office Chair That Fits You https://techreport.com/review/beyou-chair/ Wed, 12 Jan 2022 13:05:36 +0000 https://techreport.com/?p=3475530

What is the BEYOU Chair?

The BEYOU Chair is a revolutionary new office seating device. It’s now possible to do so much more with your morphing chair than just sit.

Suppose you want to work on your laptop or play on your phone. You might relax in the lounge or have a calm moment to yourself. Even practice some yoga positions or do some power napping. Whatever you do, you can now do it in the BEYOU chair. It might be better than a glass of wine after a hard day. It’s more adaptable than a Swiss army knife in the woods.

Wings that help you lounge and relax.

The BEYOU wings are really simple to change. All it takes is a single click. You can accomplish all of this while sitting down. Open them all the way up, all the way down, or anywhere in between. You do whatever suits your needs.

Each wing may be adjusted to five different positions. Use them as armrests, backrests, or knee supports. In addition, they are a great place to prop your feet up. You can even use them as yoga blocks. You won’t believe how versatile and helpful this office chair can be.

Usable both as a seat and a table.

The backrest may be used as a laptop stand or a convenient lunch table. Furthermore, you can turn it into the world’s most comfortable elbow pad. It’s small enough to fit beneath your desk. In addition, it makes typing on your keyboard a lot nicer.

Additionally, by using the backrest as a seat, you can sit comfortably with your legs crossed on a chair shaped around you. Play around with your knees or feet on the seat to see what feels nice.

The wings on the BEYOU chair are velvety and comfortable, kind of like Grandpa’s favorite recliner, yet they’re also sleek and contemporary.

For further back support, adjust the backrest up or down. It can be moved backward or forward, raised or lowered, and even slightly tilted. When it’s horizontal, it opens up a whole new world of possibilities for sitting, relaxing, and working. BEYOU is comfortable (or can be made so) because the backrest adjusts in all the appropriate places.

The BEYOU Chair Costs and Discounts

Currently, you can join the project, place an advance order, and receive the BEYOU chair for a fraction of the normal price. Join other backers to form an incredible community that is actively making a difference in the world.

Because of the large number of pre-orders, they are able to make bulk purchases for all of the chair’s components. This results in higher profits for all suppliers, so larger orders are prioritized and treated with greater attention. That means fewer glitches and a greater focus on detail.

You may be skeptical of this chair when you first look at it. That’s a normal response, but take a moment and look up the video. Once you’ve seen the product, it’s time to consider whether a seating apparatus like this is acceptable for your, ahem, professional demands.

Transition of Seat State

The Current State of Your Seat

Let’s face it: normal office chairs are awful. Even the most high-end ones make you feel like you’re at work. For the typical person who works about 10 hours every day, this is no way to live your #bestlife.

A normal office chair, according to the World Economic Forum, might cause a slew of health problems that you should avoid. Sitting idle, in an upright position all day, and not getting enough activity are all linked to heart disease, cancer, diabetes, and strokes. This revelation should be enough to jolt you off of your current perch.

The New State of Your Seat

BEYOU is an office chair that adapts to you. In other words, you modify it into a pattern that corresponds to your current mood and seating preference. Your body signals you to modify your posture depending on pressure areas. This happens much like when you’re lying down.

Moving about when sleeping is a natural occurrence, so why not do the same at your desk when you’re awake? The BEYOU office chair has five possible settings for each wing. Only your imagination and what is socially acceptable among your peers restrict your variety. But, hold on a second, you work from home? Feel free to bend this into whichever shape you like.

And now…the state of your pocketbook.

As with many other Kickstarter campaigns, launch pricing gives a significant reduction off the item’s future retail price.

You can get in on the “early bird bargain” for $349. This represents 42 percent off the suggested retail price once this item becomes mainstream.

The post BEYOU Chair: The Office Chair That Fits You appeared first on The Tech Report.

GMAX Ultra – The Scooter You’ve Been Dreaming About https://techreport.com/review/gmax-ultra-the-scooter-youve-been-dreaming-about/ Thu, 29 Apr 2021 13:00:15 +0000 https://techreport.com/?p=3474133

The GMAX Ultra Electric Scooter is the scooter you’ve been dreaming about. A seriously cool find. The GMAX has a great ride, it’s a heavy-duty piece of equipment, and the price is just about right. 

The speed goes up to about 20 mph — even uphill — which I was concerned about. Sometimes scooters or bikes poop out going uphill. But, not this little beauty. The GMAX zooms uphill like nothing else, even with my “a little hefty” friends and those whose “COVID did it” weight is still lingering. Oh, these friends loved taking a ride on my GMAX scooter.

The Scooter You’ve Been Dreaming About

I was worried because everyone in the whole neighborhood wants to take a ride on this great scooter. The scooter takes the heftier riders just fine and isn’t too fast for everyone else. The scooter itself is for an adult — not a child — though I have given a few kids a short spin on it with the kid standing in front of me on the base (there is room).

I also have the last model of the GMAX scooter and love it — but the newest model runs more smoothly. The tires are wider, which I like from an aesthetic point of view, and they provide a smoother ride. This GMAX goes quite a bit faster (20 mph) with the 350-watt motor than my other GMAX that’s a couple of years old. This GMAX Ultra has high torque — but doesn’t suck the power (you can pretty much ride all day).

We have been taking turns riding the GMAX to the office — I really didn’t want to share my scooter too much, but I really understand the draw.

I also wanted to get other opinions on this scooter for this review. We have “hardcore” athletic peeps in the office that have every type of sports equipment available. I know I love this scooter — but I loved receiving the confirmation from the team as well. 

I named my scooter “Maxie-Boy” — but I name all of my equipment because I feel personal about it! Maxie-Boy is my scooter, and I finally wrote my name on it in a Sharpie pen so no one would forget it is mine! Someone asked me why I didn’t name it “Maxie-Man” because it easily holds a large person rider, and does “manly” work. LOL — yes — why indeed?

Why the Update in the GMAX Electric Scooter?

The GMAX company collected customer feedback for over a year and then went to work on these upgrades. Yes, the upgrades are noticeable — and we appreciate the work of the GMAX company to bring about a superior product. I really like the new display and the bright headlight.

These are the Specs 

Some people really want to know the specs. You can find these on the site, but they are listed below — I just looked for a smooth ride and quality, and GMAX has both.

  • New digital display, digital lock, improved headlight, and reduced motor noise
  • Battery – 36V 17.5Ah LG
  • Motor – 350 watts (average)
  • Range – 45 miles per charge
  • Top speed – 20 mph
  • Tires – 10″ air-filled
  • Size – 42.3″ x 17.9″ x 47.4″
  • Weight – 36 lbs
  • Max rider weight – 220 lbs
  • Waterproof rating – IP54
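If you like to sanity-check spec sheets, the battery and range figures imply a modest energy budget. A quick back-of-the-envelope calculation in Python, just for illustration:

    # Pack energy and implied consumption from the listed specs.
    voltage_v = 36.0
    capacity_ah = 17.5
    range_miles = 45.0

    pack_wh = voltage_v * capacity_ah      # 36 * 17.5 = 630 Wh
    wh_per_mile = pack_wh / range_miles    # 630 / 45 = 14 Wh per mile

    print("Pack energy: %.0f Wh" % pack_wh)
    print("Implied consumption: %.1f Wh/mile" % wh_per_mile)

That 14 Wh/mile figure assumes the full rated range, so real-world riding (hills, hefty friends) will draw more.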

Where to Buy

This scooter has the new G-Series design. I think the best place to purchase the scooter is to watch the GMAX website for a sale (sales do happen, and this is what I did). 

What I Like 

The GMAX Ultra Electric Scooter is fairly budget-friendly — if you’ve looked around or purchased electric scooters before, you’ll recognize this price point. You can use your scooter for a long-range ride, and it is built tough enough that you can ride it to work every day.

The post GMAX Ultra – The Scooter You’ve Been Dreaming About appeared first on The Tech Report.

The Best Smart Mattresses of 2021 https://techreport.com/review/best-smart-mattresses/ Thu, 17 Dec 2020 22:20:26 +0000 https://techreport.com/?p=3473608

Today, technology makes many aspects of daily life easy and convenient. We use smartphones, tablets, and virtual assistants to shop, pay bills, and manage our to-do list. Now, technology can also help us find a better night’s sleep.

With many mattress brands using innovative, “smart” technologies and advanced materials, it is easier than ever to find deeper, more restful sleep. Plus, some of these beds come with sleep tracking sensors that provide a better understanding of your sleep habits so you can pinpoint areas for improvement.

Throughout this article, we outline some of the best smart mattresses of 2021. We also break down the various technologies, materials, and features you can expect to find when shopping for a smart bed.

Best Smart Mattresses of 2021

Best Smart Mattress Overall – Amerisleep AS3

When it comes to creating unique, advanced mattress materials, Amerisleep is in a league of their own. Each of their beds features Bio-Pur®—a plant-based foam that contours like traditional memory foam but without the uncomfortable sinkage and heat retention that often comes with petroleum-based mattress foams. Plus, Bio-Pur® is non-toxic, environmentally friendly, and provides superior pressure relief and comfort.

From their line of eight different mattresses, each designed to suit a specific sleep style, we recommend the Amerisleep AS3 as the best smart mattress overall. This mattress has a medium firmness and provides a balance of support and contouring that works well for many different sleepers, including side, back, and combination sleepers.

The AS3 has a 3-inch Bio-Pur® comfort layer, which hugs the body’s curves, ensuring joints are cushioned and protected from pressure points. Bio-Pur’s® responsiveness also gently lifts hips to prevent spinal misalignment, so you wake with less pain. This contouring comfort layer helps you find a weightless sleep position, so muscles can relax and heal during sleep.

Beneath this layer, Amerisleep includes their Affinity foam with HIVE® technology. Hundreds of hexagonal cutouts respond to different levels of pressure. Near sensitive joints, cutouts are springy and compress easily to keep these areas cushioned. Along the back, cutouts are firm to ensure the lumbar spine has adequate support. HIVE® technology also helps increase breathability because warm air can quickly escape through each hexagonal cutout.

The base of the AS3 is made with Amerisleep’s Bio-Core® foam. This material is strong and durable enough to maintain the mattress’s supportive shape and structure throughout decades of use.

All three layers of the AS3 are backed by a CertiPUR-US® certification, ensuring they are free of harmful chemicals. The mattress is also covered in a soft, breathable material. This cover is removable and machine washable, making it easy to keep the mattress clean and germ-free.

You can make this mattress even more high-tech with one of Amerisleep’s adjustable bases, the Adjustable Bed or the Adjustable Bed+. These bed frames offer head and leg articulation, improving circulation and digestion and alleviating symptoms of sleep apnea.

The durability of Amerisleep’s AS3 is backed by an industry-leading 20-year warranty covering structural defects, such as sags and indents greater than .75 inches. You can experience the comfort of this mattress for yourself with Amerisleep’s 100-night sleep trial. If you are not happy with the bed, Amerisleep will arrange for a pick-up and provide a full refund.

Best Smart Memory Foam Mattress – Zoma Mattress

Although the layers of the Zoma Mattress may not seem high tech at first glance, they each feature cutting-edge materials to provide better quality sleep. The pressure relief and conformability of this mattress are so impressive that it has recently become a favorite of many professional athletes.

The Zoma has a stretch-knit cover with ventilation channels to prevent heat build-up. This cover’s flexibility allows sleepers to experience the full contouring and support of the gel memory foam top layer.

The Zoma’s comfort foam is infused with a cooling gel to draw heat and moisture away from the surface of the bed, allowing you to sleep cool and undisturbed. While the gel-infusion works to keep you cool, Zoma’s Triangulex™ technology, arranged in three distinct zones, works to relieve tension and promote deep relaxation.

To encourage a safe, neutral spine, Zoma includes their Reactiv™ transition layer in its namesake mattress. This foam has a latex-like bounce to minimize sinking and keep the hips and shoulders lifted and aligned. The combination of spinal support and pressure relief is perfect for athletes and those with an active lifestyle because it promotes a healthy spinal position.

The base of this mattress features Zoma’s Support+, a durable foam that protects the upper layers from sags and indentations. This stable base ensures the mattress technology lifts and cradles the body for a perfect night’s sleep.

You can also pair this memory foam mattress with the Zoma Adjustable Bed Frame, a customizable base that lets you lift the head and legs for an even more comfortable sleep position.

If you are interested in testing out the Zoma Mattress, you can take advantage of their 100-night risk-free sleep trial. This extended trial period allows you to make sure the mattress’s comfort and support are right for your body type and sleep style. It also comes with a 10-year full replacement warranty.

Best Smart Hybrid Mattress – Vaya Hybrid

Vaya is a newer mattress brand producing a state-of-the-art hybrid bed with two foam layers and an advanced spring coil base. Vaya’s unique foam is manufactured to have a slight elasticity, similar to latex foam. While this material is soft and conforming, it won’t create excess sinking, allowing the spine to rest in a safe, neutral position.

Vaya’s CertiPUR-US® certified foams, ventilated coils, and construction method produce a mattress that naturally retains less heat and moisture than traditional memory foam varieties, helping sleepers remain cool and comfortable throughout the night. Plus, the soft, airy cover surrounding the mattress is perforated to prevent body heat from building up in the bed.

The Vaya Hybrid offers sleepers the perfect balance of comfort and support with a buoyancy similar to an innerspring mattress. This is due to the spring coil support base, which provides even weight distribution, so sleepers are never forced into awkward sleep positions that put pressure on the spine. These coils are individually wrapped and move independently for adequate contouring and pressure relief. Reinforced edge support around the perimeter of the mattress also provides better motion isolation. A firm layer of foam rests beneath the spring coils for shock absorption and increased durability.

The layers of the Vaya Hybrid have a medium comfort level, which cushions the joints and gently lifts the hips for proper alignment. Side, back, and combination sleepers will enjoy the advanced spinal support and pressure relief the Vaya Hybrid offers.

Each mattress comes with a 10-year warranty to cover sags and indents greater than .75 inches. It also covers broken or bent coils that could change the structure of the mattress. Vaya includes a 100-night sleep trial, giving sleepers enough time to sleep on the bed and make sure it meets their needs.

Best Sleep Tracking Smart Mattress – Eight Sleep’s The Pod Pro

The Eight Sleep Pod Pro offers customizable temperature control on each side of the bed—perfect for couples with different preferences. Eight Sleep’s hydro pump heats and cools water through thermo-electric elements. Hot or cold water then flows from the pump to the Active Grid™ to change the temperature on the surface of the bed. The Active Grid™ is also Wi-Fi enabled, allowing users to set their preferred temperature with Eight Sleep’s smartphone application.

In addition to controlling temperature, the Active Grid™ is also a soft, flexible material that molds to the body’s curves to relieve tension. Ambient sensors across the surface of this grid measure the temperature and humidity in the room. Using this information, the grid automatically adjusts the bed to the user’s preferred temperature.

The sensors in the Active Grid™ monitor various biometrics, including heart rate and breathing rate. These sensors act as sleep trackers to provide feedback on your sleep cycle, duration, and disruptions. A pre-programmed smart alarm can even wake you up once you have reached a lighter stage of sleep.

In addition to this advanced technology, the Pod Pro has five foam layers, giving it a total height of 12 inches. At the top of the mattress, the layers are soft and compress easily with pressure. Near the base, the foam is firm to prevent structural changes such as indentations.

The Pod Pro comes with a 100-night sleep trial, giving sleepers plenty of time to make sure the mattress meets their needs. The Pod Pro limited warranty covers the foam layers for 10 years and the technology, including the Hub and Active Grid™, for 2 years.

Best Adjustable Smart Mattress – Kingsdown Sleep Smart Air

The Kingsdown Sleep Smart Air is a hybrid mattress with a combination of pressure-relieving foam and individually wrapped spring coils. The layers of this bed are covered in a performance fabric with thermic technology. This material absorbs and dissipates heat to create a cool and inviting atmosphere.

For even more breathability, a ventilated edge support system made of high-density foam surrounds the perimeter of the mattress. Airflow channels in this foam increase air circulation while also reducing motion transfer.

The Kingsdown comes with its IntelliMax 3 zone support system. This technology consists of an air chamber in the middle of the bed that users can adjust to change the firmness and level of support. An electric motor outside the bed can inflate or deflate the air in 3 body zones: the head and neck, lumbar and hips, and legs and feet. This system gives couples complete control over the support on each side of the bed.

If you are unsure of your preferred firmness level, the mattress’s diagnostic system can find the most supportive level for you. When resting on the bed, the IntelliMax system will change each zone’s firmness to suit your body type and position.

Kingsdown’s 10-year warranty protects the layers of the mattress, while a 5-year warranty covers the technology. You can also pair the Sleep Smart Air with Kingsdown’s adjustable base for even more comfort and customization.

What is a Smart Mattress?

A smart mattress features advanced technologies and materials to help customers find a better night’s sleep. The components of most smart beds are cutting-edge and are not found in most traditional mattresses. These technologies include sleep tracking sensors, automatic firmness adjustments, and unique foam layers that provide temperature control and spinal alignment.

Features of a Smart Mattress

Below, we explain the purpose and intention behind hi-tech add-ons and common smart bed features.

  • Tracking Sensors: Sensors are placed in the cover of many smart mattresses to track sleep habits and vital signs throughout the night. Sleep data, such as how much time you spent in REM, how often you changed position, and how often your sleep was disrupted, are sent to your smartphone each morning. This information allows sleepers to find areas for improvement and make adjustments to their sleep habits.
  • Temperature Control: Many smart mattresses have innovative foam layers, such as plant-based and gel-infused foam that regulate temperature. Additionally, some smart beds automatically adjust to a preset temperature using a hydro pump or electric motor.
  • Customization: Smart mattresses with built-in air chambers allow you to adjust the bed’s firmness and support. These changes are typically done with a remote control or smartphone application.
  • Automatic Adjustments: Sensors in the top layer of many smart mattresses allow for automatic adjustments to the temperature and firmness of the bed. The firmness will change to keep you supported as you switch positions throughout the night, while temperature automatically adjusts to maintain your preferred setting.
  • Alarms: Built-in alarms can gently wake sleepers with an audio message or gentle vibrations, and some alarms can be preset to wake you once you have entered the lightest stage of sleep.
  • Connectivity: Many smart mattresses are adjusted using a smartphone application; therefore, many smart beds have Wi-Fi and Bluetooth connectivity. This feature allows you to connect the bed to your phone or a virtual assistant like Amazon Alexa or Google Home.

What to Consider When Buying a Smart Mattress

Mattresses are not one size fits all. What feels most comfortable for side sleepers may not be right for those who prefer back sleeping. Additionally, while smart features like alarms and Wi-Fi connectivity are great, they won’t necessarily make the bed more comfortable. Therefore, it is important to look at the materials and firmness of the mattress before purchasing.

The following points can help you find a smart mattress that is durable, affordable, and meets your specific needs.

Mattress Firmness

Your body type and sleep style often determine the mattress firmness that is most comfortable and supportive for you. While some smart beds have an adjustable firmness, this feature will not change the mattress’s overall feel. Therefore, it is important to understand the comfort level for each mattress you consider.

A scale from 1 to 10, with 1 being the softest and 10 being the firmest, is often used to describe a mattress’s firmness. The chart below indicates the body type and sleep positions most suitable for each firmness.

  • Soft mattresses (1 to 2) – Lightweight sleepers; ultra-soft and compresses with very little pressure
  • Medium-soft mattresses (3 to 4) – Side sleepers; cushions the joints and provides a slight firmness to prevent excess sinking
  • Medium mattresses (5) – Provides a balance of cushioning and support; suitable for side, back, and combination sleepers, as well as average weight individuals
  • Medium-firm mattresses (6 to 7) – Back sleepers and heavier individuals; offers a slight softness to reduce pressure points and firmness to prevent spinal misalignment
  • Firm mattresses (8 to 10) – Stomach sleepers; extra firm to keep the hips lifted and reduce pressure on the spine

Mattress Materials

Most smart mattresses are made with various types of foam or a combination of foam and wrapped spring coils. Below, we explain the pros and cons of these materials.

Memory Foam

Memory foam is known for its unique responsiveness. When you rest on this material, it immediately molds to the body to ensure each area is cushioned and supported so muscles can fully relax. Memory foam is available in several firmness levels, including ultra-soft, medium-soft, medium, medium-firm, and firm. The versatility of this material makes it suitable for most sleep styles and body types. The deep contouring and pressure relief of memory foam are also ideal for those with arthritis and other types of chronic joint pain.

Traditional memory foam tends to retain heat and moisture throughout the night, leading to sleep disruptions, such as night sweats. However, many smart mattresses use new, more advanced memory foam varieties made with plant-based oils and gel-infusions to combat heat retention.

Most memory foam mattresses have a durable support foam in the base to prevent sagging and indentations.

Individually Wrapped Spring Coils

Some smart mattresses are made with a combination of foam and spring coils, called a hybrid. While the memory foam top layer provides comfort and pressure relief, the spring coils offer a slight bounce. Heavier individuals and back sleepers often prefer a hybrid mattress to keep body weight evenly distributed and avoid uncomfortable sinkage.

Unlike traditional spring coils, the coils used in most smart beds are individually wrapped to provide more accurate contouring and maximum comfort. Instead of moving as a unit, these coils move independently of one another to reduce motion transfer.

Temperature Regulation

Smart beds provide more temperature regulation than traditional mattresses, which is perfect for those who sleep hot. Manufacturers achieve this by using temperature neutral foams, such as plant-based foams or those with a gel infusion.

Some smart mattresses also have ambient sensors to monitor your bedroom’s climate and automatically adjust the mattress to a preset temperature. Smart beds with dual controls give couples the ability to set a different temperature on each side of the bed.

Smart Features

Smart features such as sleep tracking, biometric monitoring, customization, and Wi-Fi connectivity can cause the mattress price to increase. Therefore, be sure these features are important to you and that you will actually use them. There is no sense in spending extra money on features that are inconvenient to use or that don’t fit into your lifestyle. After all, many smart features do not change the bed’s overall feel or improve sleep quality.

Price

Smart mattresses often come with a higher price point. The more smart features a bed has, the higher the price will be. Therefore, it is important to have a set budget in mind when shopping for a smart bed. It can be easy to get caught up in the novelty of these smart features and spend too much money.

Most high-quality smart mattresses cost between $1000 and $2000 for a queen size. However, those with an adjustable firmness level can cost upwards of $4000.

Sleep Trial

Even though many smart beds come with customizable features, it is important to test out a new mattress to make sure it is right for you. Most high-quality smart beds come with a risk-free sleep trial period, typically between 30 and 100 days.

Warranty

A 10-year warranty is fairly standard in the mattress industry. Most reputable brands will repair or replace a mattress that develops sagging or indentations greater than .75 inches during the first 10 years of use.

As you shop for a smart mattress, be sure to determine the warranty period that applies to the mattress itself and the technology in the bed. Smart beds with sensors, air chambers, and temperature controls often come with a shorter warranty for the bed’s technical components and a longer one for the mattress.

Frequently Asked Questions

What sizes are smart mattresses available in?

Most smart beds are available in standard U.S. mattress sizes, including twin, twin XL, full, queen, king, and California king. However, some smart beds, particularly those with tracking sensors and an adjustable firmness, may only be available in full, queen, king, and California king.

How does sleep affect my work?

Sleep deprivation negatively impacts our cognitive function, mental clarity, memory, and focus, which, in turn, affects our work performance. Whether you work remotely or from an office, a good night’s sleep is vital to productivity and work quality.

Can you use a regular mattress on an adjustable bed?

Yes. Most mattresses can be paired with an adjustable bed. All foam beds and many hybrid mattresses tend to work best on an adjustable frame because they are flexible enough to bend without becoming damaged. Airbeds with an adjustable firmness should not be used on an adjustable base since the air layer can become permanently damaged when bent.

Are adjustable beds good for your back?

Yes. Adjustable beds with head and leg articulation provide targeted support to the lumbar spine, easing muscle tension and pain. When the legs are lifted slightly, the lower back muscles can fully relax, allowing for better recovery and deeper sleep.

What is the difference between a hospital bed and an adjustable bed?

Hospital beds and adjustable beds are similar. They both offer head and foot articulation, allowing sleepers to find the most comfortable sleep position possible. However, many modern adjustable beds have more advanced features, such as Wi-Fi and Bluetooth connectivity, USB ports, built-in massagers, and under-bed lighting.

Conclusion

With the huge advancements in technology over the past 15 to 20 years, it is no surprise that smart tech can now help us create a more comfortable sleep space. However, it is important to remember that the most valuable “smart” features are ones that actually enhance your mattress’s comfort. Sleep tracking sensors and heart rate monitors are great, but they will not change the bed’s overall feel. Therefore, be sure to select a mattress that is supportive and comfortable for you, at a price that makes sense.

The post The Best Smart Mattresses of 2021 appeared first on The Tech Report.

Molekule Air Pro: Taking Out Pollutants at the Molecular Level https://techreport.com/review/molekule-air-pro-taking-out-pollutants-at-the-molecular-level/ Tue, 22 Sep 2020 13:58:28 +0000 https://techreport.com/?p=3473441

Air purifiers have become a common household device to reduce pet hair and particulate matter in the home. Now that pure air seems to be on everyone’s minds for greater health concerns, more consumers are turning to air purifiers to provide a cleaner home environment.

As one of the air purifiers that covers the most area (up to 1,000 square feet), the Molekule Air Pro offers a unique technology that sets it apart from other air purifiers on the market. Priced at $1,100, it’s also one of the most expensive air purifiers on the market. However, after trying it out for a few weeks, I found it is an investment that pays a healthy return in the form of clean, fresh air.

Molekule Air Purifier Technology

Molekule has developed its own air purification technology that goes far beyond that of other air purifiers on the market. Air purifiers typically all work the same way. The device draws air in that contains pollutants. The pollutants are collected on a filter or series of filters. Then, the air leaves the device without those pollutants.

Molekule’s technology goes one step further to ensure you are rid of pollutants and particulate matter. It developed technology that it refers to as PECO (photo electrochemical oxidation). The company’s website explains that this technology captures and destroys pollutants while they are still in the air that passes through the device.

Then, using a filter with a coated material that starts a chemical reaction, it breaks pollutants down to the molecular level as they pass through the air purifier. These pollutants even include allergens, bacteria, and viruses.

Other Unique Features

There are two unique features of the Air Pro not found on the other Molekule air purifier models or on other air purifiers I’ve tried. The first is the particulate matter sensor, which is made to detect three different sizes of particulate matter. The sensor shows you one of four colors to indicate the particulate level in the room. Knowing this can help you determine what fan speed you may need to use to improve the air.

The other differentiating feature on the Air Pro is the auto-protect mode, which comes in two variants: standard and quiet. This feature can automatically adjust the fan speed based on what the sensor reports about particulate matter. It is intended to optimize air purification, and it seemed to do just that.

More Features

The Molekule Air Pro has six fan speeds. It has a fairly low noise level for being so large, ranging from 33 to 64 dBA, depending on the fan speed you are using.

The Molekule App

Molekule gives you an app that you can download and use on your mobile devices. With it, you can control the air purifier from anywhere, monitor air particulate levels and filter life, and sign up for filter auto-refills. If you get more than one Molekule air purifier, the app will manage all of them regardless of the model.

Auto-Refill Filter Program

Molekule offers an auto-refill program. The filter is actually two filters in one (Pre-Filter and PECO-Filter). The refill pack costs $99 for a six-month supply, which is about how long the filters last before needing replacement.

What I Like

There are many things to like about the Molekule Air Pro. It delivers such clean and fresh air. It covers a large area, is easy to set-up and use, and it has a valuable and convenient app. The screen and app give me a clear picture of the state of the air in my home.

What Could Be Better

The Molekule Air Pro is a big investment, but it does pay off in the form of very clean air across a large space. The good thing is that Molekule also offers smaller, less expensive models in case you don’t need to cover as much area. In researching the product, I also saw that the company provides regular coupons and deals that lower the price.

Where to Buy

The Molekule Air Pro will be available for pre-order on September 22 at Amazon, Best Buy (online), and Molekule.com for $999 at launch ($200 off the regular price of $1,199). The air purifier will then ship in mid-October.

Final Thoughts

The Molekule Air Pro delivers clean air for a large space. It has features that no other air purifier has and comes from a company that is now making commercial and medical grade air purifiers for hospitals and schools. This makes me feel even more confident that I have an air purifier that is cleaning the air better than any other device currently available. For those with compromised breathing, this is especially a worthwhile investment.

This post contains affiliate links and I will be compensated if you make a purchase after clicking on my links.

The post Molekule Air Pro: Taking Out Pollutants at the Molecular Level appeared first on The Tech Report.

Rachio 3: Water-Wise, Smart Sprinkler Controller https://techreport.com/review/rachio-3-water-wise-smart-sprinkler-controller/ Fri, 04 Sep 2020 15:00:47 +0000 https://techreport.com/?p=3473126

With more concern over water conservation, more homeowners are looking for a sprinkler system controller that can help them be smart with watering their lawn and yard. In recent years, Rachio has offered various models, the latest being the Rachio 3.

Priced at $189.99 for eight zones (which is the one I added to my sprinkler system) and $239.99 for 16 zones, the Rachio 3 is a flexible smart sprinkler controller. Here’s my review.

Features

The WaterSense-certified sprinkler controller has a Weather Intelligence Plus (WIP) service that provides weather forecasts based on local weather reports as well as satellite, radar, and atmospheric flight data. This helps guide the watering cycle for that time period.

Rachio 3 integrates with many smart home controllers and devices. These include Google Assistant, Amazon Alexa, HomeKit, IFTTT, SmartThings, Wink, Xfinity, Control4, and Nexia. This is a competitive advantage for the Rachio 3 because other sprinkler controllers don’t offer the same number of integrations. For example, voice control with Apple’s Siri lets you turn the sprinklers on and off.

The Rachio 3’s app gives you a home screen that shows the controller status, the weather forecast, the last sprinkler run and the next one, and a graphic with water used and saved.

Watering Schedules

There are two options for watering schedules: a time-based schedule or a device-generated schedule. Both options offer watering restrictions and work with the Weather Intelligence Plus feature. I went with the device-generated schedule. It used the Flexible Daily option, which updates every day based on the soil moisture.

You can use smart features like the Smart Cycle. This is a cycle-and-soak feature that divides watering into multiple shorter cycles. The Rachio 3 knew when a rainstorm was coming and shut off on its own, which helped us not waste water.
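To picture what cycle-and-soak means in practice, here’s a toy Python sketch of the idea: one long run gets split into shorter watering cycles separated by soak periods so water can absorb instead of running off. It’s purely illustrative, not Rachio’s actual Smart Cycle algorithm.

    # Toy cycle-and-soak scheduler: water in short bursts with soak
    # breaks in between, instead of one long run.
    def cycle_and_soak(total_minutes, max_cycle_minutes, soak_minutes):
        steps = []
        remaining = total_minutes
        while remaining > 0:
            run = min(max_cycle_minutes, remaining)
            steps.append(("water", run))
            remaining -= run
            if remaining > 0:
                steps.append(("soak", soak_minutes))
        return steps

    # A 30-minute zone watered in 10-minute cycles with 20-minute soaks:
    print(cycle_and_soak(30, 10, 20))
    # [('water', 10), ('soak', 20), ('water', 10), ('soak', 20), ('water', 10)]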

Pros and Cons

The things I like about this smart watering device include ease of use, integrations with other smart home devices, features and watering cycle options, and the hyperlocal weather information. The device looks nice and is designed to fit well where another traditional sprinkler controller was placed.

The only real downside to this smart watering device is that it’s not waterproof. You can make it waterproof by purchasing a cover separately. It would have been nice to have the cover included as a standard feature rather than an option. Some may feel the price is steep for the smart sprinkler device, but the return in the form of a lower water bill made it worth the cost.

Where to Buy

The Rachio 3 is available on the company’s website and through retailers like Amazon and Costco, as well as major DIY stores.

Final Thoughts on the Rachio 3

This smart water controller delivers good value and has already shown the return on investment in terms of water savings. Plus, it works with so many other smart home devices. It’s easy to install and use, helping homeowners start making a difference in how much water they use.

The app and smart device integrations set the Rachio 3 above and apart from the rest of the devices. Overall, I was pleased with the significant water savings just by installing this smart device.

The post Rachio 3: Water-Wise, Smart Sprinkler Controller appeared first on The Tech Report.

AKASO Brave 6 Plus Action Camera: Ready For All Types of Adventures https://techreport.com/review/akaso-brave-6-plus-action-camera-ready-for-all-types-of-adventures/ Thu, 03 Sep 2020 15:00:39 +0000 https://techreport.com/?p=3473116

I’ve been wanting to get an action camera that would go with me and the family on all of our adventures. However, the price of most was not in the range of what I wanted to spend for a camera I would just use occasionally. I can appreciate that brands like GoPro charge what they do for all their gadgetry. But, I was looking for something at a lower price point with some bells and whistles. I found it in the AKASO Brave 6 Plus Action Camera. It’s a portable camera that can help me capture both video and photos. Here is my review of this action camera designed to work outdoors.

Features and Functionality

Although it’s a lower-priced action camera, it doesn’t seem to be budget when it comes to features and functionality. It has a two-inch LCD touchscreen, voice and remote control, WiFi capability, electronic image stabilization 2.0, 8x digital zoom, and adjustable view angle.

The action camera also features a self-timer, driving mode, time-lapse video and special effects, loop recording, motion detection, and adjustable white balance.

The waterproof case lets you dive up to 30m/98ft and provides protection from rocks, dust, and scratches.

Another feature I like is the AKASO DV app, which is free and available for download on your phone or tablet. You can review and manage the images or videos from the action camera on the go.

Pros and Cons 

Beyond the long list of features that make this seem like a more expensive camera than it is, the actual camera unit has a premium feel with nice finishes and overall design. The buttons and touchscreen also feel like this is a more advanced camera.

The accessories have a similar higher quality look and feel. Plus, most of these accessories are actually compatible with other action cameras, including GoPro.

Besides the quality and value, I like that it is easy to use, including all the accessories and attachments. It’s also good that it has USB-C capability.

The 4K video is amazing, as is the number of ways you can mount the camera to capture the action.

If I were to suggest any improvements to this nifty little, yet powerful, camera, it would be to make the voice control feature really good and reliable or just not offer it. Also on my wish list would be 4K video at 60 fps. This is an option on pricier action cameras, so I may not be entirely realistic with that request.

Although recommended, a microSD card is not included, so you would have to purchase one separately for extended memory capability.

In the Box

The AKASO Brave 6 Plus Action Camera comes with a 2.4 GHz remote, two 1350 mAh batteries and a charger, a waterproof case, a bicycle stand, seven different mounts, two clips, a helmet mount, a bandage, five tethers, an extra waterproof door, and a USB cable.

Where to Buy

Priced at between $100 and $120, you can find the AKASO Brave 6 Plus Action Camera on sites like Amazon.

Final Thoughts on AKASO Brave 6 Plus Action Camera

For a lower-cost action camera, the waterproof AKASO Brave 6 Plus Action Camera is impressive in terms of what it can do. This is the ideal budget action camera for those seeking adventure once in a while.

The post AKASO Brave 6 Plus Action Camera: Ready For All Types of Adventures appeared first on The Tech Report.

E-WIN Champion Gaming Chair: Ergonomic Comfort Wins https://techreport.com/review/e-win-champion-gaming-chair-ergonomic-comfort-wins/ Wed, 26 Aug 2020 15:42:32 +0000 https://techreport.com/?p=3473012

As more of us work from home, we are spending more time in our office chairs. Whether it is work or play, our backs are most likely feeling all the hours we spend hunched over our screens.

Enter ergonomic chairs, many of which provide comfort for long hours of sitting. However, some of these can be a major investment. That’s when I discovered that I could get a gaming chair that could double as an office chair and serve for those gaming breaks that are essential to greater productivity throughout the day.

The first PC gaming chair I tried was the E-WIN Champion Gaming Chair. Priced at just under $300, it offered an appealing price compared to those fancy ergonomic chairs, plus it looked cool in the pictures. Here’s my review.

About E-WIN Champion Gaming Chairs

The E-WIN Champion Gaming Chairs are ergonomic gaming chairs that come with a lumbar pillow. You can choose from color combinations that include black and white, black and red, black and day-glow green, black and turquoise, and black and pink.

Features include stain-resistant 2.0 PU leather, which you can clean with a cloth yet looks and feels like real leather. The chairs also contain cold-cure foam, the company’s proprietary high-density foam, which feels very similar to the memory foam you might find on a mattress. The elasticity of the foam is designed to prolong the chair’s life and comfort level.

Pros and Cons

If you have read any of my product reviews then you know that, while I love gadgets, I don’t love assembling things. Some of the other ergonomic chairs I had considered looked like you needed an engineering degree to put them together. Not this one.

All I needed was an Allen key and about 15 minutes. The pieces were organized well in the box and mostly pre-assembled. Everything was labeled so I felt confident about what I was doing.

Once assembled, I found many other things to like about the gaming chair. It is designed and made with quality materials that don’t feel in the least bit cheap. The stitching and leather-like finish shows that the gaming chair is made to last. It feels like a much more expensive ergonomic chair.

The comfort level can be adjusted to fit the person sitting in the chair. It has a hydraulic height-adjustment lever like most chairs. But it also has a reclining lever, which lets you put the chair almost into a horizontal position.

I love the armrests, which adjust as well: they rotate, slide to the left or right, raise up and down, and push forward or backward. You can even take the armrests off the chair.

In terms of cons, there is only one minor point. The lumbar and neck pillows are a bit awkward in terms of placement and feel. However, it’s not enough to make the whole chair uncomfortable.

In the Box

The box contains the chair components, including pillows. It also has a pair of gloves and an instruction manual. The chair comes with a 10-year limited warranty.

Where to Buy

The E-WIN gaming chairs are available on the E-Win website as well as through Amazon. E-Win offers free shipping and returns. Use code TECHREPORT for 20% off.

Final Thoughts

Overall, this ergonomic gaming chair is a great buy. It’s comfortable, sturdy, and easy to assemble. Everything is adjustable for even greater comfort. The quality of the materials also makes the price an even better value.

Related Post: BEYOU, The Chair That Fits You

The post E-WIN Champion Gaming Chair: Ergonomic Comfort Wins appeared first on The Tech Report.
