As things stand, there is no technique that can replace or remove the need for pens and chalk in places like universities, classrooms, research labs, and tuition centers.

But I have come up with an idea for replacing pens and chalk in these places using Augmented Reality (AR): an app, AR Pen, that lets the user create drawings in mid-air, with shapes that follow the user's finger, without any physical pen or chalk.

It will also help preserve the environment and save trees: notebook pages are made from trees, and pen ink consumes natural resources, so this app can reduce that environmental impact.

Suppose you are teaching a class and want your students to better understand a concept such as a cube, which is a 3D figure. They may struggle because it is genuinely difficult to understand a 3D figure drawn on a 2D plane. With this app, you can explain the concept in far more detail, because you can draw the 3D figure in mid-air.

What it does

Let me first explain the approach to drawing shapes that follow a human's finger with AR. Drawing is done by detecting each new location of the moving finger, dropping a vertex at that location, and connecting each vertex to the previous one. Vertices can be connected by straight line segments, or by Bezier curves if a smooth output is needed.
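As a language-agnostic sketch of the two steps above (the app itself is built in Unity, so this Python is only illustrative), a stroke can be built by dropping a vertex whenever the fingertip moves past a small threshold, then smoothing the polyline with quadratic Bezier segments. The `min_step` value and the midpoint-anchored Bezier scheme are my assumptions, not details taken from the app:

```python
import math


class StrokeBuilder:
    """Collects fingertip positions into a polyline stroke.

    A new vertex is dropped only when the finger has moved more than
    min_step from the last vertex, which filters out hand jitter.
    """

    def __init__(self, min_step=0.01):  # ~1 cm in world units (assumed)
        self.min_step = min_step
        self.vertices = []

    def add_sample(self, point):
        if not self.vertices or math.dist(point, self.vertices[-1]) >= self.min_step:
            self.vertices.append(point)


def smooth_quadratic_bezier(vertices, samples_per_segment=8):
    """Smooth a polyline by using each interior vertex as a quadratic
    Bezier control point, anchored at the midpoints of its two
    adjacent segments (a common polyline-smoothing trick)."""
    if len(vertices) < 3:
        return list(vertices)

    def midpoint(a, b):
        return tuple((u + v) / 2 for u, v in zip(a, b))

    out = [vertices[0]]
    for i in range(1, len(vertices) - 1):
        p0 = midpoint(vertices[i - 1], vertices[i])
        p1 = vertices[i]
        p2 = midpoint(vertices[i], vertices[i + 1])
        for s in range(1, samples_per_segment + 1):
            t = s / samples_per_segment
            # Quadratic Bezier: (1-t)^2 p0 + 2(1-t)t p1 + t^2 p2
            out.append(tuple(
                (1 - t) ** 2 * a + 2 * (1 - t) * t * b + t ** 2 * c
                for a, b, c in zip(p0, p1, p2)))
    out.append(vertices[-1])
    return out
```

Connecting raw vertices with line segments is what `StrokeBuilder` alone gives you; passing the result through `smooth_quadratic_bezier` gives the smoother output mentioned above.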

Modeling in Augmented Reality (AR) lets users create and manipulate virtual objects in mid-air that are aligned to their real environment. I present ARPen, a bimanual input technique for AR modeling that combines a standard smartphone with a 3D-printed pen. Users sketch with the pen in mid-air, while holding their smartphone in the other hand to see the virtual pen traces in the live camera image. ARPen combines 3D input precision with the rich interactive capabilities of the smartphone touchscreen. I studied subjective preferences for this bimanual input technique, such as how people hold the smartphone while drawing, and analyzed the performance of different bimanual techniques for selecting and drawing. Users preferred a bimanual technique casting a ray through the fingertip for both selection and translation. I provide initial design guidelines for this new class of bimanual AR modeling systems.
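The preferred technique above, casting a ray through the fingertip for selection, can be sketched as follows. This is a simplified Python illustration, not the app's code: representing scene objects as named bounding spheres is my assumption for brevity.

```python
import math


def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere intersection,
    or None if the ray misses. `direction` must be normalized."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t >= 0 else None


def select_by_fingertip_ray(camera_pos, fingertip_pos, objects):
    """Cast a ray from the camera through the fingertip position and
    return the name of the closest object it hits, or None.

    `objects` is a list of (name, center, radius) bounding spheres.
    """
    d = tuple(f - c for f, c in zip(fingertip_pos, camera_pos))
    norm = math.sqrt(sum(x * x for x in d))
    d = tuple(x / norm for x in d)
    best, best_t = None, math.inf
    for name, center, radius in objects:
        t = ray_sphere_hit(camera_pos, d, center, radius)
        if t is not None and t < best_t:
            best, best_t = name, t
    return best
```

Once an object is selected this way, translation can reuse the same ray: as the fingertip moves, the object is kept at its original distance along the new ray.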

How I built it

My application is built with the Unity engine and Google ARCore.

Challenges I ran into

Since no app of this kind was available in the market, I had to research extensively how to build one, and then build it successfully.

Accomplishments that I'm proud of

Solving all of these problems and managing to build such a unique app. I've built a solution and an application that scales well and accounts for problems no previous app has addressed.

What I've learned

Capturing user input and using it to create figures in mid-air with the help of AR.

Built With

Unity, Google ARCore
