Oxford University spinout invents body scanner for accurate clothing measurements

The online shopping experience has long been plagued by inaccurate measurements and frustrating returns. Imagine being able to avoid that hassle. Thanks to an invention by an Oxford University tech spinout, that is now a possibility.

The AI tool scans the user's body to provide accurate, precise measurements for clothing. The spinout behind it, founded in 2019 by Duncan McKay, an INSEAD MBA, and Phil Torr, a Professor of Computer Vision and Deep Learning at Oxford University, is set to revolutionize the online shopping experience.

The tool is designed to aggregate data from millions of users, which helps build an accurate model of the human body. This allows users to identify their key body measurements and find items of clothing that fit them precisely. The implications are significant: the technology has the potential to save UK retailers billions of pounds in returns.
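
The article does not describe how measurements are actually matched to garments, but conceptually the fit-finding step might resemble the minimal Python sketch below, which compares a shopper's key measurements against a hypothetical retailer size chart and picks the closest size. The size chart, measurement names, and matching rule here are illustrative assumptions, not the spinout's actual method.

```python
# Illustrative sketch only: nearest-size matching between a user's
# measurements (in cm) and a hypothetical retailer size chart.
SIZE_CHART = {
    "S":  {"chest": 90,  "waist": 76,  "hips": 94},
    "M":  {"chest": 98,  "waist": 84,  "hips": 102},
    "L":  {"chest": 106, "waist": 92,  "hips": 110},
    "XL": {"chest": 114, "waist": 100, "hips": 118},
}

def recommend_size(user: dict[str, float]) -> str:
    """Return the size whose chart measurements are closest to the user's."""
    def distance(chart: dict[str, float]) -> float:
        # Sum of squared differences over the measurements both sides share.
        return sum((chart[k] - user[k]) ** 2 for k in chart if k in user)

    return min(SIZE_CHART, key=lambda size: distance(SIZE_CHART[size]))

if __name__ == "__main__":
    shopper = {"chest": 101.0, "waist": 86.5, "hips": 104.0}
    print(recommend_size(shopper))  # -> "M" for these example measurements
```

A real system would presumably weight measurements by how much each affects fit and account for garment-specific tolerances, but the core idea of mapping a measured body to the nearest catalogue size is the same.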

The scanner combines two distinct technologies: a 3D scanner and an AI-supported algorithm. The 3D scanner uses structured light to measure body dimensions accurately, while the AI-supported algorithm analyses those measurements. Together, this data is used to build a personalised avatar of the user's body.
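
The article does not detail how individual measurements are extracted from the scan, but a much-simplified version of that step might look like the sketch below, which slices a 3D point cloud at a given height and approximates the circumference there from the slice's convex hull. The function, its parameters, and the use of NumPy/SciPy are illustrative assumptions rather than the company's pipeline.

```python
import numpy as np
from scipy.spatial import ConvexHull

def estimate_circumference(points: np.ndarray, height: float, band: float = 1.0) -> float:
    """Estimate a body circumference (e.g. chest) from a 3D scan.

    points: (N, 3) array of scan points in cm, with the z-axis pointing up.
    height: z-coordinate of the horizontal slice to measure, in cm.
    band:   half-thickness of the slice, in cm.
    """
    # Keep only points within a thin horizontal band around the target height.
    mask = np.abs(points[:, 2] - height) <= band
    slice_xy = points[mask, :2]
    if len(slice_xy) < 3:
        raise ValueError("not enough scan points at this height")

    # Approximate the circumference as the perimeter of the slice's convex hull.
    hull = ConvexHull(slice_xy)
    ring = slice_xy[hull.vertices]        # hull vertices in counterclockwise order
    closed = np.vstack([ring, ring[:1]])  # close the loop back to the first vertex
    return float(np.sum(np.linalg.norm(np.diff(closed, axis=0), axis=1)))
```

In practice a learned body model would smooth over scan noise and fill in occluded regions, which is presumably where the AI component described above comes in.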

In the long run, this technology could make online shopping easier and more efficient. It also promises significant cost savings for UK retailers, who are estimated to lose £30 billion annually to returns caused by poorly fitting clothing.

At the moment, the technology is still in its early stages of development, but it is expected to be adopted more widely in the near future. It remains to be seen how the invention will fare in the retail industry, but its potential is unquestionably immense.

