Maps,
Another one that I am sure will appear in the Telematics section is "Ford Pro". We are already in the workhorse F-150, the new Ford Transit is about to drop in ICE and BEV forms, and the existing Ford "Pro" vehicles such as the current Transit already have a great telematics back end with add-on features for individual drivers and fleet managers. Adding Seeing Machines DMS and sending the alerts through the telematics is a no-brainer: even if we don't get the monitoring fees, we still get the DMS sale.
Don't forget the BEST-SELLING VEHICLE in the UK last year wasn't a car - it was the Ford Transit.
Keep your eyes shut while driving - that seems to be the only option. Glance at an advert and the car will look up the details and spam your car with links to the advertiser. Is this what anyone wants?
JC, a great patent with some familiar names from TRI.
It uses eye gaze maps: it looks at the scene and, using past experience, predicts what you should be looking at. For example, glancing at the adjacent car you might dwell on the number plate and scan the rear quickly, and as a new object comes into view on the road ahead you should give it a glance. But if you are still not noticing the stationary car in the road, it will notify you.
So the result is similar to what Veoneer are doing, but there will be differences in how the implementation is done.
Of course, this needs to be built on rock solid and continuous eye tracking.
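To make the idea concrete, here is a minimal toy sketch of that compare-and-alert loop. Everything here is hypothetical - the names, thresholds and structures are mine, not from the patent - it just illustrates matching a predicted attention map against where the driver actually looked.

```python
# Hypothetical sketch: compare where the driver *should* be looking
# (important objects from the scene model) with where they actually
# looked, and alert on important objects that never got a glance.
import time
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    importance: float          # 0..1, from the scene-understanding model
    first_seen: float          # timestamp the object entered view
    last_gazed: float | None = None

def update_gaze(objects: list[SceneObject], gazed_name: str, now: float) -> None:
    """Record that the driver's gaze landed on an object."""
    for obj in objects:
        if obj.name == gazed_name:
            obj.last_gazed = now

def overdue_objects(objects: list[SceneObject], now: float,
                    grace_s: float = 2.0) -> list[SceneObject]:
    """Important objects the driver has still not glanced at."""
    return [o for o in objects
            if o.importance > 0.7
            and o.last_gazed is None
            and now - o.first_seen > grace_s]

# Example: a stationary car ahead that the driver keeps missing.
now = time.time()
scene = [SceneObject("stationary car ahead", 0.95, first_seen=now - 3.0),
         SceneObject("adjacent car", 0.4, first_seen=now - 5.0)]
update_gaze(scene, "adjacent car", now)
for obj in overdue_objects(scene, now):
    print(f"ALERT: driver has not looked at the {obj.name}")
```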
Interesting JC. Note the diagrams have the same Toyota AV vehicle with roof-mounted sensors seen in other recent patents.
Key part here is they need a good eye tracking DMS. But I'm not sure that many people will use an App to tell their car what medication they just took. The examples even included Red Bull!
The part the car needs to know is how many vodkas were in there too!
I think we may get a few mentions. We will soon be in every BMW model, we helped Euro NCAP write the rules, and Colin is speaking...
https://www.carhs.de/en/safetyupdate-program.html
Thursday, May 19, 2022
Safety in Automation Level 2
09:00 Level 2: A Safety System or a safe System?
Dominik Schuster - BMW Group
Classification of assisted and automated driving functions
Safety-centered development of level 2 systems
Safety requirements for level 2 systems
What are the safety potentials of level 2 systems and how can those be leveraged?
Effectiveness evaluations of level 2 systems
09:45 Euro NCAP Occupant Status Monitoring
Adriano Palao - Euro NCAP
Euro NCAP Occupant Status Monitoring assessment protocol
General requirements
Noise variables
Detection of Driver State (Distraction, Fatigue, Unresponsive Driver)
Vehicle response
Assessment Approach: Dossier + Spot Testing
Future of Euro NCAP Occupant Status Monitoring
10:15 Why Driver Monitoring is the critical Safety Function
Colin Barnden - Semicast Research Ltd
The third era of automotive safety technology
Defining "supervised automation"
Infotainment screens and safety
Why do we need driver monitoring?
What is "raw data" gathering for DMS?
The road from driver to occupant monitoring
How does "Human Factors Science" relate to DMS?
Distraction, drowsiness and impairment
Automaker ranking for advanced DMS technology
Leading DMS technology suppliers and key competency indicators
10:35 Driver Engagement
Dr. Florian Raisch - BMW Group
Driver Engagement Driving SAE L2
5 Generic Rules to Provide a Correct System Understanding
Difference of a Hands-On and a Hands-Off System
Appropriate Driver Monitoring
11:00 Coffee Break
11:30 Criticality Assessment of Driving Situations for ADAS Level 2
Dr. Andreas Kuhn - Andata Entwicklungstechnologie GmbH
12:00 IIHS Safeguards for Partial Automation
Dr. Jessica S. Jermakian - Insurance Institute for Highway Safety
12:30 Euro NCAP Automated Driving
Dr.-Ing. Patrick Seiniger - BASt - German Federal Highway Research Institute
Assisted Driving Grading (Level 2)
First ideas for Automated Driving assessment (Level 3)
13:00 Lunch Break
This is very niche, JC - not sure where it would be used. If you needed a telescope to provide enough long-distance vision to drive, then you won't have enough peripheral vision while you use the telescope. So this patent is to turn the ADAS up to 11 while you glance through the telescope lenses attached at the top of your glasses, above the usual line of sight.
In addition it can describe what you see (and what you can't see while in telescope mode that may be relevant) and even display what you are looking at on a screen.
So even if this is intended not just to support those without enough vision, but also some potential trend for augmenting your regular vision while driving, I can't see the market supporting it.
Phil
"....At this time, the details of the optical technology cannot be revealed due to multi-year development timeframes...."
They know how long it takes to develop these, so why give the competition the chance to start today? I have a very confident feeling that we developed this concept in 2019. Putting it into automotive-scale production takes time, and it is the manufacturing that was the risk.
We knew and publicised that we would be saving some cash for purchasing technology. For me this is another de-risk moment. We are signed up to "a world-wide perpetual exclusive licence which applies to both automotive and aviation market sectors" - it is in the bag, our bag!
Now, there are many ways to skin a cat. Give competitors a road map with all the toll roads (patents and licences) but not the dead ends, and they can possibly find a way to the destination, but it will be an expensive three-year drive with the meter running. They will have to take the muddy trails, reverse back and climb the mountains; it is less efficient and they will probably run out of cash, and by then Seeing Machines will have moved the destination on again with another phase of technology.
As NumptiPi says, spending A$5m (over 3 years) is a big commitment for us and we have delayed as long as is prudent. Terry is correct - absolute steal when amortised over 10, 20+ years of automotive DMS/OMS, and we know that our Automotive income is starting, so years 2 and 3 payments won't be a problem.
For the seat-hopping future of autonomous vehicles, your screen and content will follow the passenger as they swap seats.
Sounds like hell. If you can't turn it off, it will follow you. This means they can ensure you watch all of the adverts!
JC, this is a good example of how a good UI needs more than accurate eye gaze. My phone has an accurate touch screen but doesn't always get my intention, sometimes because I get distracted and sometimes because the phone is active in my pocket. Here is a good example of using alternative modes to confirm attention. In addition it is using state as well as gaze. "Careless state" may sound better in Japanese, but it is a clear indication that using eyes for selecting is prone to accidental hijacking by the brain wanting to look outside and "through" the HUD rather than at it.
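As a toy illustration of that multi-modal confirmation idea, here is a hypothetical Python sketch (the names, thresholds and states are mine, not from the patent): gaze alone never triggers the selection; dwell time, driver state and an explicit confirmation all have to agree.

```python
# Hypothetical sketch of gaze selection guarded by a second modality,
# so a stray glance "through" the HUD can't trigger an action.
def confirm_selection(gaze_target: str | None,
                      dwell_ms: int,
                      driver_state: str,
                      confirm_pressed: bool) -> str | None:
    """Return the selected item only when gaze, dwell time, driver
    state and an explicit confirmation (e.g. a wheel button) agree."""
    if gaze_target is None:
        return None
    if driver_state == "careless":      # attention is elsewhere
        return None                     # discard gaze as unintentional
    if dwell_ms < 400:                  # too brief to be deliberate
        return None
    if not confirm_pressed:             # gaze alone is never enough
        return None
    return gaze_target

assert confirm_selection("radio", 500, "attentive", True) == "radio"
assert confirm_selection("radio", 500, "careless", True) is None
```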
Thanks JC, I'd better answer these before breakfast, or CFP won't let me get my bonus!
What can I say - another attempt to make eye contact for a video call, but not with me: with a simulacrum assembled from multiple images, with my eye gaze staring into the camera despite me actually driving a car. Quite a horrifying thought really, and definitely making me look like I am undead, or Tom Hanks in The Polar Express.
A side note: two cameras at the top of the windscreen either side of the sun visor, a third in the middle of the passenger's A-pillar, and a fourth in the console area.
So despite the mention of eye gaze, this is far from what Seeing Machines would want to be involved in. It hampers safety and would require more processing than we need for DMS and OMS, and still the effect is just a gimmick that people will not want to see.
Perhaps in the conference room scenario it makes more sense for remote viewers to get the attention, but it is not something I would look forward to.
It is bloody hard work running a multi-billion dollar company while pretending to be a minnow. Don't know about the other guys, but when this all comes good, I will be spending my shares on Coke and Maseratis; the rest I will waste.
Interesting,
GM GLOBAL TECH OPERATIONS LLC - 2022-04-14 Fri 07:57
VEHICLE BEHAVIORAL MONITORING
Working out if the surrounding drivers are paying attention based on the movement of their vehicles!
Another layer on top of DMS.
It is like a virtual tour guide.
Given the planned route and the user's profile, it will direct you to look at particular objects and know if you have spotted them; if not, it can direct your gaze, and it will then recognise that you are tracking the object with your gaze as the vehicle moves (see the sketch below).
Interesting find JC
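For illustration, a hypothetical sketch of one step of that guide loop (the names, bearing convention and 5-degree threshold are my own assumptions, not from the patent):

```python
# Hypothetical sketch of the "virtual tour guide" loop: pick a landmark
# on the route, direct the gaze if needed, and confirm the user is
# tracking it as the car moves.
def angular_error(gaze_deg: float, target_deg: float) -> float:
    return abs(gaze_deg - target_deg)

def tour_guide_step(landmark: str, target_bearing_deg: float,
                    gaze_bearing_deg: float) -> str:
    """One update of the guide, comparing gaze bearing to the landmark."""
    if angular_error(gaze_bearing_deg, target_bearing_deg) < 5.0:
        return f"Tracking {landmark} - user has spotted it."
    side = "left" if target_bearing_deg < gaze_bearing_deg else "right"
    return f"Look to your {side} for {landmark}."

print(tour_guide_step("the old lighthouse", target_bearing_deg=40.0,
                      gaze_bearing_deg=10.0))
```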
Re: BOSCH GMBH ROBERT [DE] - 2022-04-28 Sun 10:28
SYSTEMS AND METHODS FOR DETECTING SYMPTOMS OF OCCUPANT ILLNESS
DavidW, this is not infringing on SEE's core territory.
This is actually a pretty neat or scary/Orwellian idea. Have a look at the photos in the patent and read the claims. This isn't really aimed at cars, but at public transport, where cameras/radar watch and/or listen to the passengers for signs of illness. The display (for the driver/guard/security) then shows the potentially ill passenger with the actions highlighted on the video: cough, sneeze, retching, writhing. It is only a small step from recording so-called anti-social activity, which is what worries me.
There is no reason for SEE to be linked with this
Seems to provide no function other than seeing whether the driver stops being sleepy or clumsy when you tell them not to be.
No worry for us
Numpti,
Eyeware only do PC software, and it is rather hard to hide the 3D gaze trackers they support - with their limited range and refresh rate - in a cockpit environment. They are only really acceptable in mock-ups with large monitors that are not representative of cockpits.
It doesn't take long to see past their demonstrations and spot the gaps. They do automotive too - now where will they install the PC?
"buy indicator" has spoken
Phil, are Toyota developing their own DMS? Well, the diagrams of the Toyota patents may all be similar, but the group of names on this patent are the stalwarts of TRI that we know we have worked with for a long time. But as S2030 says, if they are "developing" a DMS, they are "probably" developing it on top of our DMS.
This patent, for example, is about how to deploy the DMS - it turns out (sorry for the pun) that steering wheels move and block the camera. This novel implementation uses multiple fibre optics to take the light from individual lenses on the rim of the wheel, facing the driver, to an individual camera in the steering wheel or steering column. The theory being that at least one of the cameras that serves the top of the steering wheel will have a clear view of the face.
Of course this is completely impractical and expensive, not to mention that the lenses will be dirty, smeared or scratched, and the image down a fibre will not be as clear as from a direct lens-to-camera arrangement.
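For what it's worth, here is a hypothetical Python sketch of the selection logic such a multi-camera rig would need (the structures and scores are my own assumptions, not from the patent): of all the rim cameras, use the frame with the clearest, unobstructed view of the face.

```python
# Hypothetical sketch: pick the best frame from several rim cameras,
# skipping any whose lens is currently occluded by the rim or hands.
from typing import NamedTuple

class Frame(NamedTuple):
    camera_id: int
    face_confidence: float   # 0..1 from a face detector
    occluded: bool           # rim/hands blocking the lens

def pick_best_frame(frames: list[Frame]) -> Frame | None:
    """Choose the unoccluded frame with the highest face confidence."""
    usable = [f for f in frames if not f.occluded]
    return max(usable, key=lambda f: f.face_confidence, default=None)

frames = [Frame(0, 0.2, True), Frame(1, 0.85, False), Frame(2, 0.6, False)]
best = pick_best_frame(frames)
print(f"Using camera {best.camera_id}" if best else "No clear view")
```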
This is the category of patent that exists as a road block: they have no intention of developing a solution that uses it, but if every road is blocked it may be the least bad solution, and they can reap the licence fees from the fool that succeeds in making it work.
Re: INTERFACE SHARPNESS DISTRACTION MITIGATION METHOD AND SYSTEM
JC, as you probably noticed this is from the same group at Toyota in the US and uses similar diagrams (car with a LIDAR and sensor roof rack)
It explains the basics of eye tracking and dynamically blurs the infotainment screens if the driver and passenger aren't looking. When it sees them move the focus area, it ensures the display is crisp by the time they look at it - the reason is to conserve valuable "brain juice". It doesn't explain that the effect is probably lost with the world whizzing past outside the car, full of advertising hoardings and other vehicles!
"[0035] Provided the teachings in the paper by Hubel et al., the inventors have concluded that when sharp lines are present, the brain is engaged more, as opposed to when softer edges/lines are present. In other words, the brain uses more metabolic resources (i.e., blood, oxygenation, etc.) when processing sharp lines and edges versus softer lines and edges because detecting sharp lines and edges is a specialized function of a subset of neurons. Hence, when sharp lines are present in the peripheral view for task-irrelevant information, metabolic and thus, cognitive resources are being consumed unnecessarily. Such a circumstance is an inefficient use of metabolic resources, which should instead be allocated to task-supporting neural regions. An aspect is to soften image regions with edges and lines outside and away from the focus vision area. Blurred/softened lines do not lead to firing of the subset of neurons in the human visual cortex. An aspect is to gradually change from sharp image regions/lines to softened image regions/lines as the person shifts focus away from the focus vision area in an image. An aspect is to quickly sharpen image regions/lines in an image in anticipation of a person shifting their head to a new focus vision area. By softening image regions/lines outside of the focus vision area of a driver's vision, the cognitive resources freed up by blurring or softening image regions/lines of task-irrelevant images in the periphery can be reallocated to other brain regions necessary for supporting the driver, to use cognitive resources efficiently while driving a vehicle, especially when performing mission-critical tasks. An abrupt change from softer to sharper image regions/lines may help to draw attention in order to reduce time in focusing on a display image so that the driver can revert back to focusing on the task at hand. Conversely, a gradual, inconspicuous change from sharp to softer image regions/lines may minimize—or potentially eliminate—the drawing of attention to task-irrelevant information in the periphery so that the driver can continue focusing on the task at hand. Although the term “soften”, e.g., softened, softening, softens, softer, is used in this disclosure, it should be understood that the term “smooth”, e.g., smoothed, smoothing, smooths, smoother, could be used as well ..."
Re: COGNITIVE TUNNELING MITIGATION DEVICE FOR DRIVING
Hmm. Sometimes Toyota impress me, other times they disappoint. Here is the second claim in the patent:
2. The driver monitor system of claim 1, further including a first machine learning device,
wherein the audio-video device outputs a verification request of whether or not the driver feels drowsy or is focused on a point other than a driving task, and receives a response to the verification request, and
wherein the eye gaze direction, the eye lid position, the heart rate variability, and the response are fed back to the machine learning device which learns to predict whether the driver is transitioning into a cognitive tunneling state or a fatigue state.
So it asks you if you are tired or distracted, then learns to detect that state - I hope the driver tells the truth and doesn't just swear at the car!
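Mechanically, claim 2 amounts to a label-collection loop. Here is a hypothetical Python sketch (the fields and names are my own, not Toyota's): the driver's answer to the verification request becomes the training label for a model over the sensed features.

```python
# Hypothetical sketch of claim 2: the car occasionally asks the driver
# whether they feel drowsy or tunnelled, and the answer becomes the
# training label for a model over gaze, eyelid and heart-rate features.
from dataclasses import dataclass

@dataclass
class Sample:
    gaze_dir: float        # eye gaze direction, degrees off road centre
    eyelid_open: float     # eyelid position, 0 (closed) .. 1 (open)
    hrv_ms: float          # heart rate variability
    label: str             # driver's own answer: "ok", "drowsy", "tunnel"

training_set: list[Sample] = []

def on_verification_response(gaze_dir: float, eyelid_open: float,
                             hrv_ms: float, answer: str) -> None:
    """Store the self-reported state alongside the sensed features so a
    model can later learn to predict the transition without asking."""
    training_set.append(Sample(gaze_dir, eyelid_open, hrv_ms, answer))

on_verification_response(2.0, 0.4, 35.0, "drowsy")
on_verification_response(0.5, 0.9, 70.0, "ok")
print(f"{len(training_set)} labelled samples collected")
```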
JC, I rest my case.