Saturday, February 8, 2020

Blind Engineer Invents A ‘Smart Cane’ That Uses Google Maps To Help Blind People Navigate

Today, many products have been reinvented through technology. From smart planters to smart TVs, the power of technology no longer surprises us. While many of the newest creations are dedicated to entertainment, plenty contribute to our well-being, especially for people living with disabilities.
More info: Instagram | Facebook | wewalk.io
 
Recently, a revolutionary smart cane called WeWALK was introduced to help blind people navigate their surroundings much more efficiently when they are on their own.
The cane was invented by visually impaired engineer Kursat Ceylan, the CEO and co-founder of Young Guru Academy (YGA), the Turkish non-profit behind WeWALK. Being blind himself, Ceylan knows firsthand the challenges people like him face, and he decided to put his knowledge into inventing something that could greatly improve their lives.
“In these days, we are talking about flying cars, but these people have been using just a plain stick. As a blind person, when I am at the Metro station, I don’t know which is my exit… I don’t know which bus is approaching… which stores are around me. That kind of information can be provided with the WeWalk,” he told CNN.
The smart cane assists visually impaired people using smart technology, some of which we use every day.
It’s equipped with built-in speakers, a voice assistant, Google Maps integration, and sensors that send vibrations to warn of obstacles above chest level.
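The obstacle warning described above amounts to a distance check that triggers vibration. A minimal sketch of that loop might look like the following; the threshold, scaling, and `vibrate` callback are illustrative assumptions, not WeWALK's actual firmware.

```python
# Hypothetical sketch of an obstacle-warning loop for a smart cane.
# The detection range and linear intensity scaling are assumptions
# for illustration only.

CHEST_LEVEL_THRESHOLD_CM = 120  # assumed maximum warning distance

def warn_if_obstacle(distance_cm, vibrate):
    """Vibrate more strongly the closer the detected obstacle is.

    Returns the vibration intensity (0.0 means no obstacle in range).
    """
    if distance_cm >= CHEST_LEVEL_THRESHOLD_CM:
        return 0.0  # nothing in range, stay silent
    # Closer obstacle -> stronger pulse, scaled linearly to [0, 1].
    intensity = 1.0 - distance_cm / CHEST_LEVEL_THRESHOLD_CM
    vibrate(intensity)
    return intensity
```

In a real device this check would run continuously against the sensor's readings; here a plain callback stands in for the haptic motor.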
The smart cane is available on the company’s website and sells for around $500.

Tech News: This autonomous security drone is designed to guard your home

One of the new products unveiled at CES this year is a new kind of home security system — one that includes drones to patrol your property, along with sensors designed to mimic garden lights and a central processor to bring it all together.

Sunflower Labs debuted their new Sunflower Home Awareness System, which includes the eponymous Sunflowers (motion and vibration sensors that look like simple garden lights but can populate a map to show you cars, people and animals on or near your property in real time); the Bee (a fully autonomous drone that deploys and flies on its own, with cameras on board to live-stream video); and the Hive (a charging station for the Bee, which also houses the brains of the operation for crunching all the data gathered by the component parts).

Roving aerial robots keeping tabs on your property might seem a tad dystopian, and perhaps even unnecessary, when you could instead equip your estate with multiple fixed cameras and sensors for less money and less complexity. But Sunflower Labs thinks its security system is an evolution of more standard fare because it “learns and reacts to its surroundings,” improving over time.
The Bee is designed to supplement more traditional passive monitoring, and can be deployed on demand to provide more detailed information and live views of any untoward activity detected on your property. So it’s a bit like having someone always at the ready to go check out that weird noise you heard in the night — without the risk to the brave checker-upper.

Sunflower Labs was founded in 2016, and has backing from General Catalyst, among others, with offices in both San Francisco and Zurich. The system doesn’t come cheap, which shouldn’t be a surprise, given what it promises to do on paper — it starts at $9,950 and scales up depending on your specific property’s needs. The company is accepting pre-orders now, with a deposit of $999 required, and intends to start delivering the first orders to customers beginning sometime in the middle of this year.

Wednesday, May 1, 2019

Tech News: Delivery robots will soon be allowed on Washington sidewalks



Washington has become the eighth state to greenlight the use of delivery robots on sidewalks and crosswalks. Governor Jay Inslee signed the bill yesterday, following support from Starship Technologies, which specializes in autonomous last-mile and local deliveries.
Proponents of delivery robots tout reduced congestion and pollution, since the robots are primarily electric, plus they give local businesses a competitive advantage. Companies that otherwise lack a delivery infrastructure can use the technology to regain an edge from the likes of Amazon.
However, others say that such devices pose a risk to pedestrians, although there's no evidence of any major safety incidents as yet. Using a combination of computer vision, GPS and machine learning, the robots are designed to autonomously map their environment, navigate crowded areas and avoid obstacles.
Virginia paved the way for delivery robots back in 2017, and Idaho, Wisconsin, Florida, Ohio, Utah and Arizona have since followed suit. Other jurisdictions, such as San Francisco, are still grappling with the legislation involved. But with major global companies such as FedEx taking an increasing interest in the technology, states around the US will have to address the issue sooner rather than later.


Sunday, March 17, 2019

Tech News: Foldable phones are about to make the US very jealous - PhoneEnvy


Monday, July 30, 2018

Tech News: BMW’s Alexa integration gets it right

BMW will in a few days start rolling out to many of its drivers support for Amazon’s Alexa voice assistant. The fact that BMW is doing this doesn’t come as a surprise, given that it has long talked about its plans to bring Alexa — and potentially other personal assistants like Cortana and the Google Assistant — to its cars. Ahead of its official launch in Germany, Austria, the U.S. and U.K. (with other countries following at a later date), I went to Munich to take a look at what using Alexa in a BMW is all about.
As Dieter May, BMW’s senior VP for digital products, told me earlier this year, the company has long held that in-car digital assistants have to be more than just an “Echo Dot in a cup holder,” meaning that they have to be deeply integrated into the experience and the rest of the technology in the car. And that’s exactly what BMW has done here — and it has done it really well.
Update: Just to clarify, the update with Alexa functionality will be available for all BMWs that were produced after about March 2018 and that run the new BMW OS 7.0.
What maybe surprised me the most was that we’re not just talking about the voice interface here. BMW is working directly with the Alexa team at Amazon to also integrate visual responses from Alexa. Using the tablet-like display you find above the center console of most new BMWs, the service doesn’t just read out the answer but also shows additional facts or graphs when warranted. That means Alexa in a BMW is a lot more like using an Echo Show than a Dot (though you’re obviously not going to be able to watch any videos on it).
In the demo I saw, in a 2015 BMW X5 that was specifically rigged to run Alexa ahead of the launch, the display would activate when you ask for weather information, for example, or for queries that returned information from a Wikipedia post.
What’s cool here is that the BMW team styled these responses using the same design language that also governs the company’s other in-car products. So if you see the weather forecast from Alexa, that’ll look exactly like the weather forecast from BMW’s own Connected Drive system. The only difference is the “Alexa” name at the top-left of the screen.
All of this sounds easy, but I’m sure it took a good bit of negotiation with Amazon to build a system like this, especially because there’s an important second part to this integration that’s quite unique. The queries, which you start by pushing the usual “talk” button in the car (in newer models, the Alexa wake word feature will also work), are first sent to BMW’s servers before they go to Amazon. BMW wants to keep control over the data and ensure its users’ privacy, so it added this proxy in the middle. That means there’s a bit of an extra lag in getting responses from Amazon, but the team is working hard on reducing this, and for many of the queries we tried during my demo, it was already negligible.
As the team told me, the first thing it had to build was a switch that can route your queries to the right service. The car, after all, already has a built-in speech recognition service that lets you set directions in the navigation system, for example. Now, it has to recognize that the speaker said “Alexa” at the beginning of the query, then route it to the Alexa service. The team also stressed that we’re talking about a very deep integration here. “We’re not just streaming everything through your smartphone or using some plug-and-play solution,” a BMW spokesperson noted.
“You get what you’d expect from BMW, a deep integration, and to do that, we use the technology we already have in the car, especially the built-in SIM card.”
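The routing logic the team describes — detect the “Alexa” keyword, send those queries through BMW’s privacy proxy, and hand everything else to the car’s own speech service — can be sketched roughly as follows. All function names here are illustrative assumptions, not BMW’s actual code.

```python
# Rough sketch of wake-word routing between a car's built-in speech
# recognition and Alexa. The callables and the proxy step are
# illustrative assumptions, not BMW's implementation.

def route_query(transcript, alexa_proxy, builtin_assistant):
    """Send 'Alexa ...' queries through the proxy; everything else
    goes to the car's own speech service."""
    words = transcript.strip().split(None, 1)
    if words and words[0].lower() == "alexa":
        query = words[1] if len(words) > 1 else ""
        # Per the article, queries pass through BMW's servers first
        # (a privacy proxy) before reaching Amazon.
        return alexa_proxy(query)
    return builtin_assistant(transcript)
```

The extra hop through the proxy is what adds the slight response lag the article mentions; the trade-off is that BMW keeps control over its users’ data.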
One of the advantages of Alexa’s open ecosystem is its skills. Not every skill makes sense in the context of the car, and some could be outright distracting, so the team is curating a list of skills that you’ll be able to use in the car.
It’s no secret that BMW is also working with Microsoft (and many of its cloud services run on Azure). BMW argues that Alexa and Cortana have different strengths, though, with Cortana being about productivity and a connection to Office 365, for example. It’s easy to imagine a future where you could call up both Alexa and Cortana from your car — and that’s surely why BMW built its own system for routing voice commands and why it wants to have control over this process.
BMW tells me that it’ll look at how users use the new service and tune it accordingly. Because a lot of the functionality runs in the cloud, updates are easy and the team can rapidly release new features — just like any other software company.

Business News: Body scanning app 3DLOOK raises $1 million to measure your corpus

3D body scanning systems have hit the big time after years of stops and starts. Hot on the heels of Original Stitch’s Bodygram, another 3D scanner, 3DLOOK, has entered the fray with a $1 million investment to measure bodies around the world.
The founders, Vadim Rogovskiy, Ivan Makeev, and Alex Arapovd, created 3DLOOK when they found that they could measure a human body using just a smartphone. Other solutions they tried couldn’t measure fits with any precision and depended on expensive hardware.
“After more than six years of building companies in the ad tech industry I wanted to build something new which was not a commodity,” said Rogovskiy. “I wanted to overcome growth obstacles and I learned that the apparel industry had mounting return problems in e-commerce. 3DLOOK’s co-founders spent over a year on pure R&D and testing new approaches and combinations of different technologies before creating SAIA (Scanning Artificial Intelligence for Apparel) in 2016.”
The team had raised $400,000 to date and most recently closed a $1 million seed round to grow the company.
The team also collects “fit profiles” and is able to supply these profiles based on “geographic location, age, and gender groups.” This means that 3DLOOK can give you exact sizes based on your scanned measurements and tell you how clothes will fit on your body. They have 20,000 profiles already and are working with eight paying customers and five large enterprise systems. Lemonade Fashion and Koviem are both using the platform.
“3DLOOK is the first company that managed to build a technology that allows capturing human body measurements with just two casual photos, and plans to disrupt the market of online apparel sales, offering brands and small stores an API for desktop and SDK for mobile to gather clients measurements and build custom clothing proposals,” said Rogovskiy. “Additionally, the company collects the database of human body measurements so that brands could build better clothing for all types of body and solve fit and return problems. It will not only allow stores to sell more apparel, it will allow people get the quality apparel.”
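As a rough illustration of the kind of API the quote describes — two casual photos in, body measurements out — here is a hypothetical client-side payload builder. The field names and request shape are invented for illustration; 3DLOOK’s actual API will differ.

```python
# Hypothetical sketch of a client payload for a "two photos ->
# measurements" API like the one described. Field names and shape
# are invented for illustration, not 3DLOOK's real interface.
import base64

def build_measure_request(front_photo_bytes, side_photo_bytes,
                          height_cm, gender):
    """Assemble a JSON-ready payload from two casual photos plus
    the basic metadata a body scanner would plausibly need."""
    return {
        "front_image": base64.b64encode(front_photo_bytes).decode("ascii"),
        "side_image": base64.b64encode(side_photo_bytes).decode("ascii"),
        "height_cm": height_cm,
        "gender": gender,
    }
```

A store integrating such an SDK would send this payload to the vendor’s endpoint and receive back a set of measurements to match against its size charts.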
3D scanners have gotten better and better over the years, and it’s interesting to see companies able to scan bodies from just a few photos. While these systems can’t account for matters of taste, they can definitely make sure that your clothes fit before you order them.