After all the discussion about PLM 360 data laying the foundation for all of the ‘Next Generation’ tools at Autodesk, it seemed like a good time to look at how this might work. (I said ‘might’.)

The Current Proposition

Autodesk PLM 360

Source: EMA Design Automation

The last post I wrote discussed having all the data, models included, on the PLM 360 cloud, or as I like to shorten it, “up on top”. Summarized, this concept has all the data, however segmented, in the company’s PLM 360 space. As new designs are created, the model data is stored along with the requirements, specifications, task orders, quotes, and so on. All are maintained according to whatever rules are associated with their respective workspaces and lifecycles, etc.

Hopefully you can see how convenient that could be. What is more significant in my mind, however, is the ability to manage all the data, models included, simply as data. As time and technology advance, the different data types will become more and more alike, making their physical separation by workflow less useful.
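To make that concrete, here is a tiny sketch of what “the model as just another piece of data” could look like. This is my own illustration with made-up names, not the actual PLM 360 schema:

```python
# My own illustration, not the PLM 360 schema: the CAD model is just one more
# field on a record, governed by the same workspace and lifecycle rules as the
# requirements, quotes, and task orders around it.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class WorkspaceItem:
    item_id: str
    workspace: str                   # e.g. "Designs", "Quotes", "Task Orders"
    lifecycle_state: str             # e.g. "In Work", "In Review", "Released"
    attributes: dict = field(default_factory=dict)
    model_uri: Optional[str] = None  # the model itself: just another attribute

design = WorkspaceItem(
    item_id="D-1001",
    workspace="Designs",
    lifecycle_state="In Work",
    attributes={"requirement": "REQ-42", "quote": "Q-7"},
    model_uri="cloud://tenant/models/D-1001.ipt",
)
```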

Rendering Visualizations

Cloud Computing

Source: ZDNet

I remember the first time I heard about this technology. I was not quite happy at the prospect. Here’s the basic skinny:

In order to perform math-intensive, polygon-intensive calculations with resources left over for the user, a computer requires specialized and/or expensive hardware: multiple processors, streamlined graphics cards with high core-count GPUs, RAM out the wazoo, and more. Those who could not afford these trinkets just couldn’t play with the big dogs.

The same thing happened in the gaming community. Online game makers eventually realized they had excluded an entire, vast portion of the population. They also noticed that since everyone’s data had to go ‘up on top’ through the same pipe, and return the same way, there was a certain commonality in the data, the resulting calculations, and how it all was distributed.

So they reasoned that since their collective online game servers were vastly more capable than the players’ machines, and capable of rendering and distributing all the data for everyone involved, why not simply calculate all the player interactions ‘up on top’? Instead of sending the collective changes in the scene to 50 computers with varying capabilities and hardware, the game servers would compile the changes, render the video superbly, and send that exact same image to everyone simultaneously. Only two factors stood in the way:

  • The bandwidth and throughput of internet connections (including the users’ hardware)
  • The very hurt feelings of those who had invested so much money to play these graphically intense games.

A new router and NIC are substantially cheaper and easier to replace than RAM, CPUs, and graphics cards. The new system was the great equalizer and, more importantly, an enabler for so many.
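If it helps to see it in code, here is a rough sketch of that “compute once, send everyone the same frame” loop. All the names are mine and the rendering/encoding is stubbed out; this is the shape of the idea, not any vendor’s implementation:

```python
# Hypothetical sketch of server-side rendering: the server computes all player
# interactions once, encodes one frame, and streams identical bytes to every
# client, regardless of the hardware on their desks.

import struct
import time

def broadcast_frame(clients, frame_bytes):
    """Send the same encoded frame, length-prefixed, to every connected socket."""
    header = struct.pack("!I", len(frame_bytes))
    for conn in clients:
        conn.sendall(header + frame_bytes)

def render_and_encode(world_state):
    # Stand-in for the real work: physics, scene rendering, video encoding.
    return repr(world_state).encode()

def game_loop(clients):
    world_state = {"tick": 0}
    while clients:
        world_state["tick"] += 1                # all interactions, 'up on top'
        frame = render_and_encode(world_state)  # one render pass for everyone
        broadcast_frame(clients, frame)         # same image, simultaneously
        time.sleep(1 / 30)                      # roughly 30 frames per second
```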

That very same concept is what’s being used to make the PLM 360 ‘everything up on top’ idea a reality:

  • You don’t have to have 8-core CPUs
  • You don’t have to have extremely expensive graphics capabilities

All you have to have is enough processor and RAM to quickly take the streamed data from the cloud and get it to the user’s eyes.
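The matching client-side sketch is almost embarrassingly small, which is the point. Again, the names are hypothetical and the display step is stubbed:

```python
# The thin client: no GPU-heavy work, just read the length-prefixed frames the
# server streams down and hand them to a decode/display routine.

import struct

def recv_exact(conn, n):
    """Read exactly n bytes from a socket, or fail if the stream closes."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed")
        buf += chunk
    return buf

def receive_frames(conn, display):
    while True:
        (length,) = struct.unpack("!I", recv_exact(conn, 4))
        frame = recv_exact(conn, length)
        display(frame)  # decode + blit: the only real work left on the client
```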

Local Data Perception

Shake it off. You and I do not touch our data. Not our emails, not our models. We touch a keyboard, a mouse, and (if we’re privileged) a 3DConnexion Space Pilot Pro. Everything else is remote.

One of the recurring themes in my chats with Oleg Shilovitsky is user perception. There is no difference between the server in your office that you are wirelessly connected to and the server sitting in someone else’s office, representing the ‘cloud’, that you are also wirelessly connected to.

Additionally, I’d like to take security out of the equation. I’d bet the physical and technological security guarding cloud data is substantially better than whatever is guarding your office server data. Do you have an armed guard patrolling your office at night? I bet not. How about better hardware security than a SonicWall? Probably not in the SMB market that I represent.

With that out of the way, the only difference in where the servers sit is transmission time and distance. Technology keeps improving, and the lag grows smaller by the day. How long does it take to get a Google search result on a fast, clean OS and PC setup? Not much time at all.
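You can put a number on that yourself. Here is a quick standard-library sketch (the endpoint and sample count are arbitrary choices of mine) that times a few round trips; try it against a cloud service and against the server down the hall:

```python
# Rough round-trip timing: connect, receive the response headers, record the
# elapsed time. Compare a cloud endpoint against your own office server.

import time
import urllib.request

def round_trip_ms(url, tries=5):
    samples = []
    for _ in range(tries):
        start = time.perf_counter()
        urllib.request.urlopen(url).close()  # connect + first response
        samples.append((time.perf_counter() - start) * 1000)
    return min(samples)  # best case, ignoring one-off hiccups

print(round_trip_ms("https://www.google.com"))
```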

Functionality

Buzz Kross Introducing Autodesk PLM 360

Source: Autodesk, Inc.

Instead of telling your Exchange server to send an email with data attached from another server partition (data that was most likely dragged and dropped into your email client), you simply instruct the system that already has everything ‘up on top’ to do it. The difference is that PLM 360 will already know which version of the data you want issued and who is involved in the project. Moreover, it can tell the team exactly when it was sent, what was sent, who sent it, and for what purpose. All of this falls out of the fact that the model data is merely one part of the entire data set.
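Here is a sketch of why that bookkeeping comes for free when everything lives in one data set. The record layout is my own invention, not the actual PLM 360 schema:

```python
# Hypothetical share-event record: because the model lives in the same data set
# as everything else, 'sending' it is just appending an event that captures who
# sent which version to whom, when, and for what purpose.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ShareEvent:
    item_id: str
    version: int    # the system already knows the right revision
    sent_by: str
    sent_to: tuple  # the project team, resolved from the workspace
    purpose: str
    sent_at: datetime

event = ShareEvent(
    item_id="D-1001",
    version=3,
    sent_by="engineer@example.com",
    sent_to=("vendor@example.com",),
    purpose="RFQ package",
    sent_at=datetime.now(timezone.utc),
)
```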

Future Technology

Software vendors leveraging cloud technologies are looking to the future for answers to the same question we are all asking: how are you going to pull this transformation off?

“Autodesk is focusing on how to develop modern cloud-based PLM technology. It includes the assumption of modern web-based architecture, distributed locations, etc. We are planning to avoid the strategy of saying ‘cloud is just another location for our PLM servers’.” Oleg Shilovitsky, Sr. Director, PLM and Data Management, Autodesk Inc.

NVIDIA GeForce Cloud Computing

Source: OnLiveSpot

My thoughts are biased towards the capabilities that exist today, and have been for some time. (I’m not privy to the newest developments and covert research in these spaces.) While many of us could argue about the limitations of current technologies, we all know those limitations will have new names and descriptions in 5 years. In a very short period of time, the performance the cloud can deliver will have overtaken the best hardware and software sitting in almost any SMB business today. Once the ‘up on top’ platform(s) go hot, the barriers that keep the majority of companies out of the SMB design space will no longer include software and hardware.

Comment From NVIDIA

The following is a passage from a great article on the future of hardware technology in online gaming and data services, and on how that technology is changing radically and reshaping the user experience. Check it out:

“Currently there are so many different form factors like iOS devices, Android devices, Smart TVs and others which are incompatible with the general software development on PCs. The ability to stream to any of these devices is becoming more and more important for software developers, and the ability to reach all these devices is extended a lot by cloud gaming…” Phil Eisler, the general manager of cloud gaming at NVIDIA.

Source: OnLiveSpot