Automatic Learning based 2D-to-3D Image Conversion - Dept. of Computer Science and Engineering, Alvas Institute of Engineering and Technology (AIET)

As promised, here's some code that achieves the basics of what you are looking for. I've deviated from the previous contributor's post because I think you were getting confused by the normalised vectors. The approach below is less generalised but perhaps a little easier to understand. I'm using a map image I found on Wikipedia, so using the coordinates you supplied the line ends up in the wrong place, but it should work for you with the image you've already chosen to base the coordinates on. I haven't implemented the Bezier curves describing the flight path - I just want to check you are comfortable with the basics first. Anyway, here's the code.

```java
delhi = new PVector(427, 0, 234); // my coordinates are nearly the same as yours,
                                  // but note that I've swapped the Y and Z values

translate(width/2, height/2, 0);  // translate the transformation point to the centre of the screen
rotateX(-PI/4);                   // apply a rotation about the x axis to tilt the plane we will be drawing to
rotateY(mouseX/200.0);            // allow us to pan round the map by moving the mouse
scale(0.6);                       // you can apply a scaling factor here to suit the size of your img
translate(-img.width/2, 0, -img.height/2); // translate the transformation point one last time by half the
                                           // width and half the height of the image, so the image lies flat
                                           // on the XZ plane and my PVector coordinates are mapped correctly

vertex(0, 0, 0, 0, 0); // texture uv coordinates are the last two numbers

line(delhi.x, delhi.y, delhi.z, mumbai.x, mumbai.y, mumbai.z);
```
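The transform chain above ultimately just centres the map image on the origin of the XZ plane, so you can work out by hand where any map pixel ends up before the camera transforms are applied. Here is a minimal plain-Java sketch of that mapping (outside Processing; the 1000x600 map size and the helper name `mapPixelToXZ` are invented for illustration):

```java
// Sketch of the coordinate mapping used above (plain Java, not Processing).
// After translate(-imgW/2, 0, -imgH/2), a map pixel (px, py) sits at world
// position (px - imgW/2, 0, py - imgH/2) on the XZ plane.
public class FlatMap {
    static double[] mapPixelToXZ(double px, double py, double imgW, double imgH) {
        return new double[] { px - imgW / 2.0, 0.0, py - imgH / 2.0 };
    }

    public static void main(String[] args) {
        // Delhi at image pixel (427, 234) on a hypothetical 1000x600 map:
        double[] delhi = mapPixelToXZ(427, 234, 1000, 600);
        System.out.println(delhi[0] + ", " + delhi[1] + ", " + delhi[2]);
        // -> -73.0, 0.0, -66.0 : Delhi sits left of and above the map centre
    }
}
```

Because the pixel coordinates are used directly as X and Z (with Y as height), swapping Y and Z in the PVectors, as the post does, is what makes the points land on the flat map.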
You might store your 22 airports as PVectors, fine. But they're not points in the Processing window: they are, first of all, points somewhere on the surface of the mesh which displays the earth texture - so they are just points on the map of the earth.

First you should normalise the coordinates so they all have values from 0.0 to 1.0. (I use H and V for horizontal/vertical so as not to mix them up with coordinates in world-space.) Calculate the positions like this: maybe Kennedy Airport is located at pixel 25, 77 of your texture, and the texture has a width of 400 and a height of 300 pixels. Divide 25 by 400 and 77 by 300 to get values from 0.0 to 1.0; the Kennedy Airport pixel is then around position 0.0625, 0.2567. Store them in imaginary forum-variables: KAP.H and KAP.V.

You have the corners of the mesh: those vertex-thingamagickies. They lie somewhere in space at some X, Y and Z position. Don't care where exactly - we know Kennedy Airport is located at 6.25% of the width and at 25.67% of the height. The top-left corner vertex (TLC) of the earth mesh represents H 0.0, V 0.0; the bottom-right corner vertex (BRC) represents H 1.0, V 1.0 of the map's image.

Since mesh and texture are fortunately both rectangular, we don't need to worry that much and can use a simple formula to create two vectors which point horizontal and vertical relative to the mesh surface: subtract TLC from the top-right corner to get HVector, and subtract TLC from the bottom-left corner to get VVector. Now we have the directions and lengths of the horizontal and vertical borders, and we also have the position of the TLC (TopLeftCorner). Now find Kennedy Airport's map coordinates on the mesh surface like this:

```
X = TLC.X + HVector.X * KAP.H + VVector.X * KAP.V
Y = TLC.Y + HVector.Y * KAP.H + VVector.Y * KAP.V
Z = TLC.Z + HVector.Z * KAP.H + VVector.Z * KAP.V
```

The result (X, Y, Z) is the position of Kennedy Airport on the untransformed and unrotated earth mesh. If you use a matrix to rotate the earth mesh, draw Kennedy Airport within that rotation block (before calling popMatrix or pushing another one), or use the same rotation and translation values as for the earth mesh. If you use scaling methods on the mesh, you'll have to scale HVector and VVector by the same value. If you're done with Kennedy Airport, go on to LAX.
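The normalise-then-interpolate steps above can be sketched in plain Java (outside Processing). The corner position and border vectors below are invented example values; only the pixel (25, 77) and the 400x300 texture size come from the post:

```java
// Minimal sketch of the normalise-and-interpolate idea (plain Java).
// Corner/border values are made-up examples for a flat rectangular mesh.
public class MeshMap {
    // Map a texture pixel (px, py) on a texW x texH map onto a mesh spanned
    // by its top-left corner (tlc) plus horizontal/vertical border vectors.
    static double[] pixelToMesh(double px, double py, double texW, double texH,
                                double[] tlc, double[] hVec, double[] vVec) {
        double h = px / texW;  // KAP.H: 0.0 .. 1.0 across the map width
        double v = py / texH;  // KAP.V: 0.0 .. 1.0 down the map height
        return new double[] {
            tlc[0] + hVec[0] * h + vVec[0] * v,
            tlc[1] + hVec[1] * h + vVec[1] * v,
            tlc[2] + hVec[2] * h + vVec[2] * v
        };
    }

    public static void main(String[] args) {
        double[] tlc  = {-200, 0, -150}; // top-left corner vertex (example)
        double[] hVec = {400, 0, 0};     // TLC -> top-right border (example)
        double[] vVec = {0, 0, 300};     // TLC -> bottom-left border (example)
        // Kennedy Airport at pixel (25, 77) of a 400x300 texture:
        double[] kap = pixelToMesh(25, 77, 400, 300, tlc, hVec, vVec);
        System.out.printf("%.1f, %.1f, %.1f%n", kap[0], kap[1], kap[2]);
        // -> -175.0, 0.0, -73.0
    }
}
```

Because the interpolation is done in the mesh's own untransformed coordinates, the same rotation/translation applied to the mesh must also be applied when drawing the resulting point, exactly as the post warns.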