AI is learning to take your words less literally


Image from the article; source at the bottom of this blog.

In a report from Honolulu, Hawaii, researchers are now teaching artificial intelligence systems how to take things “less literally.” This means helping the machines make “small talk” and teaching them to understand humans’ conversational phrases, tones and more.

According to the article, researchers are teaching artificial intelligence systems to understand the meanings of words not based on their dictionary definitions, but on their conversational meaning.

At a recent AI convention, developers unveiled a new type of AI that reads what a person is really saying, trained to pick up on what the human actually wants from it. Another developer created an AI that can “distinguish between literal and figurative phrases in writing,” according to the article.

According to a researcher at Carnegie Mellon University, one important thing AI researchers are trying to implement is an understanding of how facial expressions and tone change the meaning of words. This researcher, for example, compared how the meaning of the word “sick” changes depending on whether someone frowns and says it in a monotone voice, or smiles and enthusiastically calls a movie “sick.”

The Carnegie Mellon researchers have taught an AI system to learn the meaning of human conversation by having the system watch YouTube videos, so it can learn how facial expressions affect meaning in language.
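To make the idea concrete, here is a tiny sketch of how a nonverbal cue can flip the literal reading of a word. All the scores and rules below are hypothetical illustrations of the general multimodal idea, not the researchers’ actual model, which learns these associations from video:

```python
# Toy sketch: combining a text cue with facial-expression and voice cues
# to disambiguate a slang word like "sick". All numbers here are
# invented for illustration.

# Dictionary polarity alone calls "sick" negative.
TEXT_POLARITY = {"sick": -0.8, "great": 0.9}

def interpret(word, smiling, enthusiastic_voice):
    score = TEXT_POLARITY.get(word, 0.0)
    # Nonverbal cues can outweigh the literal reading.
    if smiling:
        score += 0.9
    if enthusiastic_voice:
        score += 0.5
    return "positive (figurative)" if score > 0 else "negative (literal)"

print(interpret("sick", smiling=False, enthusiastic_voice=False))  # negative (literal)
print(interpret("sick", smiling=True, enthusiastic_voice=True))    # positive (figurative)
```

The same word gets two opposite readings depending on the face and voice that come with it, which is exactly the behavior the researchers are trying to teach.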

According to the article, many researchers all over the country are now studying this new way of helping AI understand language whose meaning is always changing as humans develop new definitions for old words. There is new research on AI watching videos and reading text to differentiate between idioms, catchphrases and more.

Overall, I think this is a really cool line of research, because language changes meaning all over the country and the same word can mean different things depending on where you say it. I think this could help AI become more realistic and knowledgeable in the future instead of just giving definitions from a large database. I would love to try this out and see whether AI systems can determine the meaning of what I am saying and whether I mean it figuratively or literally. It is an interesting concept that will surely further the growing realm of AI research and how it can help people in the future.

Sources:

https://www.sciencenews.org/article/artificial-intelligence-learning-not-be-so-literal

 

Verizon, Samsung announce team-up to release a 5G smartphone during the first half of 2019 - Caitlyn Frolo, Week 4, Post 2

 

In an article by USA Today, Samsung and Verizon are teaming up to promise cellphone buyers that 5G will be available before summer 2019. Samsung said it plans to release a 5G-compatible smartphone in this time period. The 5G network has not launched yet, but these companies are working hard to get it up and running.

Verizon currently has a 5G network up and running in some areas, which are being used for testing before the network goes live all over the country. These areas include Houston, Indianapolis, Los Angeles and Sacramento, according to the article.

Samsung says it will create this smartphone by connecting it through the “Snapdragon Mobile Platform,” which is made by Qualcomm. The article notes that Samsung typically releases a new Galaxy S every spring, which hints that the newest model may be 5G compatible.

It is currently a race among Apple, Sprint, Samsung, Verizon and other phone companies to be the first to use 5G as a new form of connection, and in 2019 we will very likely see that happen.

So what is 5G?

According to an article by Verizon about their 5G network, 5G will help enable advances in virtual reality, artificial intelligence and robotics.

Verizon also promises that the capabilities of the personal computer and the internet will be brought together through 5G smartphone technology.

The website says 5G will run on a higher frequency than the current 4G connection and will come with reduced latency.

Overall, I hope we are able to see 5G in the next year or so and it will be interesting to see how other companies follow in Verizon and Samsung’s footsteps! I think it is also cool to see competing companies working together to make advancements for everyone!

 

 

Sources:

https://www.usatoday.com/story/tech/talkingtech/2018/12/03/verizon-samsung-partner-5-g-smartphone-first-half-2019/2191131002/

https://www.verizonwireless.com/5g/

Image Source: https://www.tubefilter.com/2018/02/21/5g-duopoly-facebook-google-cut-all-cords/

Samsung “Bots” hint at the future of human care - Frolo, Week 3, Post 1

This image comes from the article where I found the information. See below for a works cited.

At a press conference this January, Samsung unveiled a new line of robots and exoskeletons to the public. The demos showed three new robotic machines being tested, named Bot Air, Bot Care and Bot Retail.

According to demos at the conference, Bot Care is being advertised as a robot able to do household chores, among other things. The name Bot Care comes from the idea that this robot will be able to take care of elderly people within the comfort of their own homes.

The robot is designed in a cone shape with a “head” screen on top. The head contains many sensors that allow it to move freely within a home without running into couches or tables. The robot can also take heart rate and blood pressure measurements, and it uses sleep-monitoring software that helps a buyer understand the quality of their own sleep.

According to the article, Bot Care will remind buyers to take their medicine and share their measurements with an app called Samsung Health. The device will also have the capability to link to Internet of Things devices. Samsung marketed this with the ability for elderly buyers to link their fall risk software with the Bot Care, letting the device know when someone has fallen and coming to their aid by calling 911 and notifying family.
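The fall-response chain described above can be sketched as a simple event handler. The function and device names below are invented for illustration; Samsung has not published an API for Bot Care:

```python
# Hypothetical sketch of how a care robot might react to a fall event
# reported by a linked Internet of Things sensor. All names here are
# invented; this is not a real Samsung interface.

def handle_fall_event(resident, emergency_contacts):
    """Return the ordered list of actions the robot would take."""
    actions = []
    actions.append(f"move to {resident}")   # navigate to the person's aid
    actions.append("call 911")              # summon emergency services
    for contact in emergency_contacts:
        actions.append(f"notify {contact}") # alert family members
    return actions

print(handle_fall_event("grandmother", ["daughter", "son"]))
# ['move to grandmother', 'call 911', 'notify daughter', 'notify son']
```

The key point is the chaining: the IoT fall sensor only has to raise one event, and the robot carries out the whole response on its own.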

Bot Air is being marketed as an air quality robot that will detect areas in a buyer’s home that are low in air quality. The bot will then move to that area and turn on an air purifier within it.

According to the article, Bot Retail is the most conventional of the three: many companies are making similar technology to help with shopping, food delivery, directions and locations.

Samsung also unveiled devices that can be strapped onto the body to help buyers walk, correct their posture and improve their stability while moving.

The devices, called the Gait Enhancing Motivational System (GEMS), will be available in versions for different levels of need.

The conference ended by leaving those in attendance with questions about the new robotics and devices.

It is also important to note that nothing was mentioned about whether these devices would ever be sold to the public or what their price range would be.

These robots are an interesting concept in terms of connecting them to Internet of Things devices. However, leaving the price range and availability unanswered raises an important question about how necessary and wanted these robots will be in the future.

I like the concept of the Bot Care device, which could be beneficial to the elderly market and could ease worry for the families and friends of those who need care.

Overall, these devices are an interesting concept and I am excited to see more companies release technology like this in the future to be tested for consumer use.

Works Cited:

https://www.engadget.com/2019/01/08/samsung-care-bot-exoskeleton-hands-on/

AI Can Find Genetic Syndromes With Pictures – Caitlyn Frolo Week 1, Post 1

 

Photo Source: Taken from article website (see source below)

A new AI program can apparently identify genetic syndromes more accurately than doctors. The study suggests that looking only at an image of a person’s face could help diagnose rare disorders faster than doctors can. According to Karen Gripp, a medical geneticist working on the program, adding the AI software can improve workflow and give patients a better chance of being diagnosed faster.

The software was created because of doctors’ inability to learn every genetic disorder; there are hundreds, and many are rare. Diagnosing patients with these disorders is also an incredibly long process. From this observation, scientists created DeepGestalt, the software in question. After learning from over 150,000 patients, the AI was put to the test.

DeepGestalt works by looking for “landmarks” on a patient’s face. It then zooms in on facial features, like the nose or mouth, and crops them into small pixel sections. After this, it analyzes these zoomed-in sections for genetic syndromes and their probability. In further experimentation, DeepGestalt was better at recognizing syndromes than doctors. According to the report, the AI was able to diagnose patients with 97% accuracy.
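The “crop, score each region, then combine” pipeline can be sketched in a few lines. The stub classifier and its numbers below are invented for illustration; the real DeepGestalt model is a deep neural network trained on patient photos:

```python
# Simplified sketch of the landmark-crop-score-aggregate pipeline
# described above. score_region() is a stand-in for a per-region
# neural network; its probabilities are invented for illustration.

def score_region(region_name):
    """Return hypothetical syndrome probabilities for one facial region."""
    fake_scores = {
        "nose":  {"syndrome_A": 0.7, "syndrome_B": 0.3},
        "mouth": {"syndrome_A": 0.6, "syndrome_B": 0.4},
        "eyes":  {"syndrome_A": 0.8, "syndrome_B": 0.2},
    }
    return fake_scores[region_name]

def diagnose(regions):
    """Average each region's scores and return the most probable syndrome."""
    totals = {}
    for region in regions:
        for syndrome, p in score_region(region).items():
            totals[syndrome] = totals.get(syndrome, 0.0) + p
    averages = {s: total / len(regions) for s, total in totals.items()}
    return max(averages, key=averages.get)

print(diagnose(["nose", "mouth", "eyes"]))  # syndrome_A
```

Scoring each cropped region separately and then aggregating is what lets the system notice subtle cues that may only appear in one part of the face.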

In conclusion, this AI software is very interesting in that it could help physicians and health care professionals diagnose rare genetic syndromes in a shorter time period. This software could also be beneficial in decreasing the cost of doctor visits for patients, because fewer appointments would be needed. DeepGestalt may also help people become more educated about rare genetic disorders and look for signs themselves. Overall, this is an interesting advancement in the medical field that can help patients and doctors for years to come.

Source: http://blogs.discovermagazine.com/d-brief/2019/01/08/ai-diagnoses-genetic-syndromes-pictures-face/#.XDtwjC3MxmA