Have you wanted a quick way to generate an as-built of a site, or a reproduction of an existing object with accurate 3D model data? Project Photofly is the answer.

What is it?

Photofly is a cloud-based service that takes your photographs of an object, any object, and maps them to a scene using photogrammetry and point clouds. Once mapped, outputs such as polyline CAD models can be exported and used for design purposes.


Hesitant, but hopeful

This project is a bittersweet realization. I have some experience with photogrammetry, some of it great, some not so good. I have spent innumerable hours in Photomodeler preparing construction and manufacturing projects. I have seen what powerful apps can do, and what it takes to make them work.

I started this when I grew tired of driving out to a bridge with a ladder and a cloth tape to get odd bridge measurements for a company. While we were contracted to gather support data, no one understood the scope of what was required. I couldn’t pull my crews out numerous times a week for little things, so I’d just drive out there. Eventually, I took a load of pictures and got Photomodeler. The last measurements they requested were calculated on the model in my Dell workstation. I never went back to the site until construction had already commenced. Awesome tool.

Well, while some projects were great successes, others were failures, always as a result of limited access and not being able to return for the follow-up photographs I needed. I spent weeks on one project, pushed on by the fact that I had already invested so much time, and kept trying to use other people’s photos. Lessons learned.

How does this technology work?

Applications like Photomodeler work by using known factors about the photographs, such as focal length and user-identified common points, to calculate the camera positions. Once the camera positions are determined, modeling the object is as easy as tracing over the points in the cloud. The cloud may contain as few as a couple hundred points, or as many as the processor and RAM will allow.
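To make that concrete, here is a minimal sketch of the triangulation step: once two camera projection matrices have been solved, a point picked in both photos can be intersected back into 3D. This is my own illustration, not Photomodeler’s or Photofly’s actual code, and the camera matrices and pixel coordinates below are made up for the example.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point seen in two photos.

    P1, P2 : 3x4 camera projection matrices (already solved for).
    x1, x2 : (u, v) pixel coordinates of the same point in each photo.
    Returns the estimated 3D point.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The smallest singular vector of A is the homogeneous 3D point.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Made-up example: two cameras one unit apart, both looking down +Z.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # focal length in pixels
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.2, -0.1, 5.0])                # a point five units out
x1 = P1 @ np.append(X_true, 1); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1); x2 = x2[:2] / x2[2]
print(triangulate(P1, P2, x1, x2))                 # ~ [0.2, -0.1, 5.0]
```

The real applications solve thousands of points this way and bundle-adjust the whole set; the sketch only shows why knowing the camera positions makes the rest of the modeling feel like tracing.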


Why am I hoping for something better?

Photofly is different from other apps I have used because it lets you submit your photos by uploading them to a server, which then sends you back a scene. That’s crazy!!! The scene is built for you in the cloud. The servers stitch the camera positions and photographs together by themselves. The scene is then served back to you, and from there additional definitions can be applied and models can be exported.


I spent days upon days checking redundancies and minimizing calculated point closures. If it is really this simple, it will revolutionize the industry.

You have got to give this a try.

Tips

I will be giving this a whirl. I’ll get back to you on what I find, but until then, here are some basic tips I have learned along the way.

  • Shoot as many off-angle shots as possible – Face-on shots are nice, but shots taken at 45° angles are better for coordinating with the others. My suggestion is to take face-on shots in conjunction with as many angled shots as needed.
  • Use ONE FOCAL LENGTH when possible – Some apps allow you to save the focal length of each camera station and apply it as needed. My experience nailing down a bridge to sub-centimeter accuracy was enabled by having an exact camera focal-length definition. Photofly may well use the metadata in the photographs to determine focal length; the app never asks for that information. (A quick way to check the focal lengths recorded in your own photos is sketched after this list.) Until I can report more, I suggest taking your test images at one focal length, or, as I prefer, zoomed all the way in or all the way out. Don’t use odd zooms until we see how Photofly reacts.
  • Shoot A LOT of photos – Don’t be shy. Reposition yourself slightly, and reshoot the previous shot.  You’ll be surprised at how a small change and a lot of overlap will help you later on.
  • Don’t forget VERTICAL – If you need an accurate model, then you need a good degree of vertical angle difference in your shots. Remember, the computer can’t make scenes from images that you did not shoot. If you have only rough shots of a rooftop, then that area will only be calculated roughly.
  • Use positive control when possible – For the bridge job, I learned to set targets throughout the site that could be seen and picked from any photograph angle. I made sure that at least one target could be seen in every photo, and having two in many of them is a real advantage. This gives you a control network around the subject that is closed up tight, so hunting for the calculated point that is throwing off your balance is no longer an issue: the one that says it is having trouble is the one that is the trouble.
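Since Photofly seems to lean on the metadata in the photos for focal length, here is a small helper I might run before uploading to confirm a batch of shots was all taken at one setting. It is my own sketch, not part of Photofly; it assumes the Pillow imaging library and a made-up ./bridge_photos folder of JPEGs.

```python
from pathlib import Path
from PIL import Image  # Pillow

EXIF_IFD = 0x8769      # pointer to the EXIF sub-IFD
FOCAL_LENGTH = 0x920A  # FocalLength tag, recorded in millimeters

def focal_lengths(folder):
    """Map each JPEG in `folder` to the focal length stored in its EXIF data."""
    lengths = {}
    for path in sorted(Path(folder).glob("*.jpg")):
        with Image.open(path) as img:
            exif = img.getexif().get_ifd(EXIF_IFD)
            lengths[path.name] = exif.get(FOCAL_LENGTH)  # None if the tag is missing
    return lengths

if __name__ == "__main__":
    shots = focal_lengths("./bridge_photos")  # hypothetical folder of test images
    for name, mm in shots.items():
        print(f"{name}: {mm} mm")
    unique = {float(mm) for mm in shots.values() if mm is not None}
    if len(unique) > 1:
        print("Warning: mixed focal lengths in this set:", sorted(unique))
```

If the script warns about mixed focal lengths, I would reshoot or split the set rather than gamble on how the solver handles the odd zooms.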

What I’ll be doing

I have submitted a project and am waiting for its return. Once I get it, I’ll be asking for some of Autodesk’s input on specific features. Check back to see how it all goes.

I sent in the better half of a failed project. It is possible that it will come back vague, but hey, the point cloud technology may just solve some things that I could not. I was using photos from other people: the objects were lit differently, the focal lengths were guessed, and in some cases the subject had been slightly altered between shots.

The first step is to send in all of my own, consistent photographs and see what comes of it. Then I’ll try to add in the odd stuff and see how it goes. I have no idea how Photofly will react.

I am kind of excited to see, though.