AR Cut Paste For Photoshop
Programmer Cyril Diagne has shown a very futuristic prototype: using a phone to cut objects out of your surroundings and paste them into Photoshop on a computer. Let's take a look at this AR cut-and-paste technology.

The prototype primarily leverages machine learning working in concert with several pieces of software: a mobile app, a local server connected to Photoshop on the computer, and a service that detects objects and removes backgrounds from photos. It also uses an algorithm that detects where the smartphone's camera is pointing on the computer screen, which ties the phone to Photoshop and lets the object be pasted in the right place.
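The exact pointing-detection method isn't detailed here, but a common way to do this is to detect the four corners of the monitor in the camera frame and compute a homography that maps the phone's aim point (the center of its camera frame) onto the Photoshop canvas. A minimal NumPy sketch of that idea, with hypothetical corner coordinates (the function names and values are illustrative, not from the project):

```python
import numpy as np

def homography(src, dst):
    """Solve the 3x3 homography H mapping src[i] -> dst[i] (4 point pairs, DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The smallest singular vector of A is the flattened H (up to scale).
    _, _, vt = np.linalg.svd(np.array(A))
    return vt[-1].reshape(3, 3)

def project(H, pt):
    """Apply homography H to a 2-D point."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Corners of the monitor as detected in the camera frame (hypothetical values)...
screen_in_camera = [(212, 140), (1050, 165), (1020, 660), (240, 630)]
# ...matched to the corners of a 1920x1080 Photoshop canvas.
canvas = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]

H = homography(screen_in_camera, canvas)
# The phone "points at" the center of its camera frame; paste the cutout there.
paste_x, paste_y = project(H, (640, 400))
```

With the homography in hand, any pixel the phone's camera sees on the monitor can be translated into canvas coordinates, which is what lets the paste land where the user is aiming.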
Diagne said that the prototype is still being refined; currently the cut takes about 2.5 seconds and the paste about 4 seconds.
Diagne describes it as an AR+ML prototype that lets you cut elements from your surroundings and paste them into image editing software. Although only Photoshop is supported at the moment, it may handle other outputs in the future.
I wrote the mobile app using Expo, as I wanted to try the React Native platform. There are a few rough edges with Android support, but the dev workflow is impressively smooth. It could become an ideal ML interaction design research platform if it could run TFLite models without ejecting.
The secret sauce here is BASNet (Qin et al., CVPR 2019) for salient object detection and background removal. The accuracy and range of this model are stunning, and there are many nice use cases, so I packaged it as a Docker micro-service.
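The post-processing step isn't spelled out, but the core idea behind background removal with a model like BASNet is simple: the network returns a per-pixel saliency map, and that map becomes the alpha channel of the cutout, making the background transparent. A minimal NumPy sketch of that compositing step (the function name and toy data are mine, not from the project):

```python
import numpy as np

def cutout_rgba(image, saliency, threshold=0.5, soft=True):
    """Turn an RGB image and a saliency map (values in [0, 1]) into an
    RGBA cutout with a transparent background.

    image:    (H, W, 3) uint8 array
    saliency: (H, W) float array, 1.0 = salient object, 0.0 = background
    """
    if soft:
        # Keep soft edges: use the saliency probabilities directly as alpha.
        alpha = (saliency * 255).astype(np.uint8)
    else:
        # Hard matte: everything above the threshold is fully opaque.
        alpha = np.where(saliency >= threshold, 255, 0).astype(np.uint8)
    return np.dstack([image, alpha])

# Toy example: a 2x2 "photo" where the model marked the left column as the object.
img = np.full((2, 2, 3), 200, dtype=np.uint8)
sal = np.array([[0.9, 0.1],
                [0.8, 0.2]])
rgba = cutout_rgba(img, sal, soft=False)  # left pixels opaque, right transparent
```

Wrapping exactly this kind of function behind an HTTP endpoint in a Docker container is what turns the model into a reusable background-removal service.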
Right now, latency is roughly 2.5s for cut and 4s for paste. There are tons of ways to speed up the whole flow, but that weekend just went too fast. Anyway, that's it for now. Let me know if you have any questions. Otherwise, see you next week for another AI+UX prototype!
– Cyril Diagne