Self-driving is the buzzword in transport today. Who will build the first mass-produced autonomous car? Will deliveries be made by self-driving fleets of vehicles? Will we even need our own cars in ten years, or will we simply hail a driverless ride via an app?
Nobody knows where it’s heading, but one thing is certain: advanced driver-assistance systems (ADAS) are on the rise. Whether the goal is keeping you safe by reading the road ahead or achieving full autonomy, the amount of hardware and software installed in new cars for this purpose is increasing year on year. One of the major factors holding these systems back, however, is cost.

Currently, most proposed solutions rely on LiDAR, which fires pulses of laser light that reflect off the surrounding world and bounce back to a sensor. The differences in return time and reflected signal from different surfaces enable the system to map the world around it in 3D.
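The time-of-flight principle at the heart of LiDAR is simple enough to sketch: a pulse travels out to a surface and back, so the range is half the round-trip time multiplied by the speed of light. A minimal illustration in Python:

```python
# Illustrative time-of-flight range calculation, the principle behind LiDAR.
# A pulse travels to the target and back, so:
#   range = (speed of light * round-trip time) / 2

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def lidar_range_m(round_trip_seconds: float) -> float:
    """Return target distance in metres from a pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A reflection arriving 200 nanoseconds after the pulse left
# corresponds to a target roughly 30 metres away.
print(round(lidar_range_m(200e-9), 2))  # ~29.98 m
```

The tiny numbers involved hint at why the hardware is expensive: resolving distances to centimetre accuracy means timing light pulses to fractions of a nanosecond.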
LiDAR has traditionally been used in archaeology, geography and seismology; in recent years, however, the technology has been miniaturised and is now one of the favourites for a self-driving future.
It’s not without drawbacks, though. High cost is the first hurdle, along with the bulkiness of the kit. The next problem is that these systems can be degraded by heavy rain, low cloud or a particularly high sun. Another is the powerful laser beams they use, which can affect the human eye, so not ideal in a built-up area.
However, you don’t need LiDAR at all; even Mr Musk says so. With the quality of today’s cameras, Tesla is proving that self-driving cars don’t need expensive hardware, but driving in certain conditions, such as fog, still poses a considerable problem. This is where we step in.
Exeros have developed an algorithm called SeeTrue that can actually see through fog, and it does so without the need for any sort of infrared projection. All you need is a camera and SeeTrue.

The algorithm can either be embedded into new cameras or applied post-capture via server-side integration, where it works as an in-line filter for existing cameras.
It doesn’t just work on fog, either; the system lets users see clearly through haze, mist, rain and even smoke. Our software differentiates between direct, scattered and diffracted light, then strips away the layer of weather, leaving the raw footage with no loss of resolution.
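SeeTrue itself is proprietary, but the physics it exploits can be illustrated with a well-known public technique: dark-channel-prior dehazing (He et al.), which models a hazy image as I = J·t + A·(1−t), where J is the haze-free scene, t the transmission (how much direct light survives the weather) and A the airlight (the scattered component), then inverts the model. The NumPy sketch below shows that idea; it is not SeeTrue’s code, just a standard baseline built on the same direct-versus-scattered-light decomposition:

```python
# Sketch of dark-channel-prior dehazing (He et al.), NOT SeeTrue's algorithm.
# Model: hazy image I = J * t + A * (1 - t); we estimate t and A, then solve for J.
import numpy as np

def dark_channel(img: np.ndarray, patch: int = 7) -> np.ndarray:
    """Per-pixel minimum over colour channels, then a local minimum over a patch."""
    mins = img.min(axis=2)
    pad = patch // 2
    padded = np.pad(mins, pad, mode="edge")
    h, w = mins.shape
    out = np.empty_like(mins)
    for y in range(h):
        for x in range(w):
            out[y, x] = padded[y:y + patch, x:x + patch].min()
    return out

def dehaze(img: np.ndarray, patch: int = 7, omega: float = 0.95,
           t_min: float = 0.1) -> np.ndarray:
    """Invert I = J*t + A*(1-t) for an RGB image with values in [0, 1]."""
    dc = dark_channel(img, patch)
    # Airlight A: colour of the brightest pixels in the dark channel (the haziest spots).
    flat = dc.ravel()
    top = flat.argsort()[-max(1, flat.size // 1000):]
    A = img.reshape(-1, 3)[top].max(axis=0)
    # Transmission estimate: haze-free patches have a near-zero dark channel,
    # so whatever remains after normalising by A is attributed to haze.
    t = 1.0 - omega * dark_channel(img / A, patch)
    t = np.clip(t, t_min, 1.0)
    # Recover scene radiance: J = (I - A) / t + A.
    J = (img - A) / t[..., None] + A
    return np.clip(J, 0.0, 1.0)
```

A real-time, resolution-preserving filter of the kind described above has to be considerably more sophisticated than this (the naive patch minimum alone produces halo artefacts), but the underlying decomposition into direct and scattered light is the same.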
We believe that SeeTrue technology will advance autonomous driving by at least half a decade, and make driving in fog a possibility for affordable ADAS.