For the last couple of years, I have periodically heard the term “deepfake videos,” but prior to completing the research for this post, I didn’t know much about them. In fact, my knowledge of deepfake videos was limited to a few key facts that I’d heard repeated on the news and the internet: the average person can’t tell the difference between a real video and a deepfake, anyone with a computer can make one, they will soon be everywhere, and they will definitely destabilize democracy.
Zigbee is an open standard for Wireless Personal Area Network (WPAN) communication that allows Zigbee-certified products to connect and communicate using the same IoT language. Companies that utilize and support Zigbee are part of the Zigbee Alliance. To date, the Zigbee Alliance contains over 500 companies, all of which work together to create and utilize Zigbee in their product design. Some of the most notable companies in the Zigbee Alliance include Comcast, Honeywell, IKEA, Legrand, Samsung SmartThings, and Amazon.
Nginx is something that most software developers have used at some point in their careers. However, there is a difference between working with Nginx and really understanding it. Nginx offers many advanced features that most people are not aware of and, therefore, are not utilizing.
In this post, I’ll go over some of the unique features that Nginx offers and how they can help you take your web application to the next level.
As developers and designers, we are creating new things every day. I like to say that we are really good at making the impossible possible. In fact, some of us are so good at it that we actually do it unintentionally. These unintended outcomes, states that our code can represent but that should never actually occur, are called “impossible states.”
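To make the idea concrete, here is a minimal TypeScript sketch (the `NaiveState` and `RequestState` types are hypothetical names invented for illustration). Modeling a data fetch with independent fields lets the compiler accept nonsensical combinations, while a discriminated union makes those impossible states unrepresentable:

```typescript
// Naive model: independent fields can combine in ways that should
// never happen, e.g. "loading" while also holding an error and data.
interface NaiveState {
  isLoading: boolean;
  error: string | null;
  data: string | null;
}
// The compiler happily accepts this contradictory state:
const oops: NaiveState = { isLoading: true, error: "timeout", data: "stale" };

// Safer model: a discriminated union makes each valid state explicit,
// so the impossible combinations simply cannot be expressed.
type RequestState =
  | { status: "idle" }
  | { status: "loading" }
  | { status: "error"; error: string }
  | { status: "success"; data: string };

function describe(state: RequestState): string {
  // Narrowing on `status` gives access to exactly the fields
  // that exist in that state, and nothing else.
  switch (state.status) {
    case "idle":
      return "Not started";
    case "loading":
      return "Loading";
    case "error":
      return `Failed: ${state.error}`;
    case "success":
      return `Got: ${state.data}`;
  }
}

console.log(describe({ status: "success", data: "hello" }));
```

With the union version, a state like “loading and errored at the same time” is a type error rather than a runtime surprise, which is the whole point of designing impossible states out of existence.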
“Smart” technology is quickly emerging in all areas of our lives. From smartphones to smart televisions, refrigerators, watches, and even dog collars, it seems like everything around us is being connected to the internet. This phenomenon is known as the Internet of Things (IoT).
We are not strangers to the unprecedented ways that new technological devices can reshape society. In the last decade, we have witnessed how things like smartphones and social media have dramatically altered how we, as humans, interact.
One of the major technological advances that will likely continue to shape our human interactions is brain-computer interface (BCI) technology. In this post, I am going to delve into the history of the BCI and look at some of the current developments happening in the realm of BCI technology.
It may seem strange to bring up textiles when discussing computer programming. However, my interest in the correlation between the two was piqued last week when my friend sent me a question currently circulating on the internet: Is it possible to knit DOOM? Thinking about this question led me to consider the immense influence that the textile industry has had on computer science and modern technology.
Creating a successful application isn’t just about ensuring that all of the components work; the layout and design of the application are also crucial. The design must be professional and engaging, and the layout should be easy for users to navigate. Design components, such as animations and navigation transitions, can also enhance the usability of the application.
When you think about the future of artificial intelligence (AI) technology, it’s likely that you don’t think of auto-completion. However, you probably should. In July 2020, OpenAI released a beta testing version of GPT-3, a new auto-completion program that could very likely define the next decade of AI programming.