
Posts published in “Technology”

CES 2024: Coding Simplified

By Kent Yang | Staff Writer

Kicking off this year is CES 2024, also known as the Consumer Electronics Show and the biggest tech event of the year. CES is an annual trade show typically held at the Las Vegas Convention Center, featuring the latest advancements in consumer technology. It all began in 1967 in New York City, when organizers held the first CES. Among the 200 exhibitors were notable attractions such as pocket radios and TVs with integrated circuits, which were groundbreaking at the time and helped draw in over 17,000 attendees. Since then, CES has continued to serve as a global stage for innovation. read more

Larian Studios’ “Baldur’s Gate 3” Takes Over the 2023 Game Awards

By Kent Yang | Staff Writer

Image Copyright Larian Studios

The Game Awards, initiated in 2014 by veteran game journalist Geoff Keighley, stand as an annual ceremony honoring both creative and technical accomplishments within the gaming industry. Since its inception, the show has gained immense popularity, evolving into the most-watched awards ceremony in entertainment. Last year’s Game Awards ceremony drew a staggering 103 million streams. To put this into perspective, the Oscars, a longstanding and renowned awards ceremony, struggled to attain 20 million television viewers. read more

A Year in Tech

Top Tech Trends and Innovations of 2023

By Kent Yang | Staff Writer

Within the dynamic world of technology, several noteworthy trends and innovations emerged in 2023, shaping the foundation of the next great digital era. 

Quantum computing took center stage with IBM’s groundbreaking development of a new quantum computer capable of executing specific calculations millions of times faster than its predecessors. 

Meanwhile, artificial intelligence (AI) had a breakout year, with major strides in generative AI, deep learning platforms, and autonomous robots and vehicles. Tech giants like IBM, Apple, Intel, and NVIDIA are in a fierce race to create the best hardware for leveraging AI technologies. read more

A World Where You Define: Intel 14th Gen Processors

By Kent Yang | Staff Writer

Intel’s 14th Generation processors, code-named Raptor Lake-S, are consumer-class processors built on the Intel 7 process, previously known as 10nm Enhanced SuperFin. Keep in mind that not all transistor densities are equal; Intel’s 10nm node is comparable to TSMC’s 7nm node. The new platform also boasts a range of impressive features, such as the Intel Extreme Tuning Utility (XTU) and more.

XTU with AI assistance simplifies overclocking to the click of a button. By analyzing CPU voltages, motherboard power settings, thermals, and other parameters, the AI assistant formulates the best overclock settings for your PC, giving you an extra boost of performance for free! Compared with Intel’s 13th Generation, Intel claims up to 18% better multi-threaded performance, and up to 23% better gaming performance than the leading competitor, AMD’s Ryzen 9 7950X3D. read more

AMD Resurrects Sought-After HEDT Processors

By Kent Yang | Staff Writer

It’s no secret that AMD listens to its customers, and it has done so yet again with the revival of its High-End Desktop (HEDT) processors. The Ryzen Threadripper 7000 series targets prosumers and enthusiasts, while the Threadripper Pro 7000 series is aimed at enterprise customers. Both Threadripper lines are built on TSMC’s 5nm process and use the sTR5 socket.

The AMD Ryzen Threadripper 7000 series includes an eight Core Complex Die layout on the TRX50 platform. A Core Complex Die, or CCD, is a cluster of eight CPU cores that share access to a common L3 cache. Testing conducted by AMD shows the flagship 7980X performing between 4% and 94% better than Intel’s competing Xeon W9-3495X, depending on the application being tested. The chips became available on November 21st, 2023. The AMD Ryzen Threadripper 7000 series consists of the following: read more
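As an aside for readers new to chiplet designs, here is a minimal illustrative sketch, in Python, of how cores and shared L3 cache add up across the CCDs described above. The 32 MB-per-CCD figure is an assumption used only for illustration, not a number quoted in the article.

```python
# Illustrative sketch of a chiplet (CCD) layout like the one described above.
# Assumption: 32 MB of shared L3 cache per CCD is an example value only.

CORES_PER_CCD = 8          # each Core Complex Die clusters 8 CPU cores
L3_PER_CCD_MB = 32         # assumed shared L3 cache per CCD (illustrative)

def topology(ccd_count: int) -> dict:
    """Return total cores and total L3 cache for a given number of CCDs."""
    return {
        "ccds": ccd_count,
        "total_cores": ccd_count * CORES_PER_CCD,
        "total_l3_mb": ccd_count * L3_PER_CCD_MB,
    }

if __name__ == "__main__":
    # An eight-CCD part, like the top of the Threadripper 7000 lineup,
    # works out to 64 cores in this simple model.
    print(topology(8))   # {'ccds': 8, 'total_cores': 64, 'total_l3_mb': 256}
```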

Industry Leading Power Efficiency Showcased at Apple Event

By Kent Yang | Staff Writer

After boasting that hard work isn’t hard at all when you’re on a Mac, Apple introduced its new lineup of ARM-based processors: the M3 Family. Ever since switching from Intel processors to ARM-based processors a few years ago, Apple’s aim has been long battery life with performance that stays consistent whether the machine is plugged in or running on battery. Not only has Apple succeeded in this endeavor, it has also become an industry leader in power efficiency. Notably, the M3 Family is also the first set of personal computer chips built on a 3-nanometer process. The M3 Family is listed below: read more

The Brains of the Digital World, Central Processing Units (CPUs)

By Kent Yang | Observer Staff Writer

Have you ever purchased a brand-new smartphone, laptop, or desktop and thought, "Wow! It’s so fast!"? For day-to-day applications, the main reason, working alongside other components such as memory, the graphics card, and solid-state storage, is the processor.

What is a Central Processing Unit, or CPU? Think of it as the brain of a computer, much like the brain of the human body. The CPU is a small but powerful component that processes and performs calculations from the instructions it receives. CPUs are sometimes referred to as "chips" because the CPU is one of the main types of logic chip. read more
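To make "performing calculations from the instructions it receives" a little more concrete, here is a minimal, purely illustrative sketch in Python of the fetch-decode-execute loop that real CPUs perform in hardware. The tiny instruction set is invented for this example and does not correspond to any real processor.

```python
# A toy fetch-decode-execute loop, loosely illustrating what a CPU does.
# The instruction set below is invented for this example only.

def run(program):
    registers = {"A": 0, "B": 0}
    pc = 0                                # program counter: which instruction is next
    while pc < len(program):
        op, *args = program[pc]           # fetch and decode the instruction
        if op == "LOAD":                  # LOAD reg, value
            registers[args[0]] = args[1]
        elif op == "ADD":                 # ADD dest, src  (dest = dest + src)
            registers[args[0]] += registers[args[1]]
        elif op == "PRINT":               # PRINT reg
            print(registers[args[0]])
        pc += 1                           # execute done; move to the next instruction
    return registers

# Example: load two numbers, add them, and print the result (prints 7).
run([("LOAD", "A", 3), ("LOAD", "B", 4), ("ADD", "A", "B"), ("PRINT", "A")])
```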

Cybersecurity Awareness Month

By Kent Yang | Observer Staff Writer

Last month marked the 20th anniversary of Cybersecurity Awareness Month. This campaign was created in 2004 by the Department of Homeland Security and the National Cybersecurity Alliance to ensure every American has the resources needed to stay safe and secure online. This year, the focus of Cybersecurity Awareness Month was on four critical cybersecurity practices.

The first of these practices is enabling Multi-Factor Authentication (MFA), a tool that adds another layer of protection to your account by requiring a security token or code to verify your login. This token is usually delivered via text, email, or an app like Google Authenticator, depending on the preference the user selects when setting up MFA. read more
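For the technically curious, the rotating six-digit codes produced by authenticator apps generally follow the time-based one-time password (TOTP) scheme from RFC 6238. The sketch below is a minimal, illustrative implementation using only Python’s standard library; the shared secret shown is a made-up example, and a real deployment would rely on a vetted library rather than hand-rolled code.

```python
# Minimal illustrative TOTP (RFC 6238) generator, the scheme behind most
# authenticator-app codes. For real systems, use a maintained library.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    """Derive the current one-time code from a shared Base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period              # 30-second time step
    msg = struct.pack(">Q", counter)                  # counter as 8-byte big-endian
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Example with a made-up shared secret; the server and the app derive the same code.
print(totp("JBSWY3DPEHPK3PXP"))
```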

A Double-Edged Sword

Balancing the Benefits and Ethical Dilemmas of AI

By Kent Yang | Observer Contributor

Artificial Intelligence and applications of AI, such as Machine Learning, have ushered in a new era of technological advancement, transforming industries and enhancing our daily lives. However, this enhancement also comes with an ethical double-edged sword. While AI offers tremendous benefits, it also raises ethical concerns that demand thoughtful consideration.

AI technologies offer the potential to revolutionize the workplace by automating tasks, thereby enhancing efficiency and productivity and reducing on-the-job injuries. However, this advancement also brings concerns about job displacement. According to a report from the World Economic Forum, AI may displace approximately 85 million jobs by 2025, but it is also expected to create 97 million new roles, which illustrates the double-edged nature of AI’s impact on employment. read more

Is It “Bye-Bye” for WiFi?

By Daniela Perez | Observer Contributor

When it comes to submitting online assignments, having reliable access to the internet is essential for college students.

Many areas on the Gardner campus have little to no cellular signal, and in most parts of the school the WiFi can be slow or may not connect at all.

One of the places on campus that rarely has any signal is the basement of the school, where the Media Arts and Technology labs are held and where MRT students work on projects. The MRT computers are always offline, and students have a difficult time connecting to the internet through their own laptops or cellphones. read more