
Digital Manufacturing at HPCC ’12

<img style="float: left;" src="http://media2.hpcwire.com/dmr/Accio_Energy_Panels.jpg" alt="" width="102" height="94" />Newport, Rhode Island in late March is the setting for the annual National High Performance Computing and Communications Conference (HPCC-USA). Not familiar with the conference? You’re not alone. HPCC-USA traditionally hasn’t focused on manufacturing. But that seems to have changed noticeably this year.

HPCC-USA is a conference that I’ve attended for the last four years, although it’s been held since 1987. It started as a traditional high performance computing conference in an intimate setting, oriented around the computing needs of the Department of Defense (DoD). It has traditionally attracted the scientific users of the DoD laboratories, whose computing needs have been served by the High Performance Computing Modernization Program (HPCMP). But these needs have historically not had much to do with manufacturing.

You can imagine that much has changed since HPCC-USA’s founding in 1987.  The conference has changed also.  Although the theme of this year’s 26th edition was Supercomputing: A Global Perspective, a second theme on the use of HPC in Digital Manufacturing crept into the agenda. I wanted to blog on this second theme and point out some common threads that we saw in last week’s talks.

But first, full disclosure: I was part of the campaign to slip this second theme into the conference. But I had the willing conspiracy of the conference co-chairs, John West and John Miguel. John West is the new leader of the HPCMP, which sponsors the CREATE program (Computational Research and Engineering Acquisition Tools and Environments). CREATE seeks to put digital manufacturing tools into the hands of those who make decisions about our military purchases, a sign of the DoD’s increased attention to manufacturing.

John West led off the conference by addressing “What’s missing from HPC?” He touched upon several barriers that hold back HPC from becoming a ubiquitous tool in manufacturing, including the lack of an HPC-literate workforce. Doug Post, the leader of CREATE, described in more detail some lessons learned from the program. And both spoke about how the HPCMP’s view of these barriers extrapolates to the broader manufacturing community.

I share many of their views and was pleased to help them organize this second theme. Seeking case studies of real small businesses that have augmented their design process with HPC tools, I looked for companies that could explain their workflow in a fashion that could be extrapolated to other small businesses.

We know that the business case for making investments in these tools is very important. It’s not enough for those of us who’ve been embedded in the HPC community to simply assert that digital manufacturing is good for your business. I wanted to hear from actual companies about what it meant to them, to their profitability, and to their development of new products and getting them to market.

The stage was set by Gardner Carrick of the National Manufacturing Institute, one of the leading organizations assisting members of the National Association of Manufacturers with new technologies, and was continued by a “Rapid-Fire” panel session run by Addison Snell of Intersect360.

Here, however, I’ll concentrate on three talks from three very different small businesses that have made this transition.

Steve Legensky, the CEO of Intelligent Light [1], spoke about their work for Zipp Speed Weaponry, supporting Zipp’s introduction of a new bicycle-racing wheel. This wheel has transformed the international racing scene (and the pocketbooks of amateur riders, like me). You can read more detail about Intelligent Light and Zipp in the December 9 edition of DMR [2].

Steve expanded on the story and told us more about how it was done. Zipp’s business return was not in the form of reduced costs from computing, but in the production of a qualitatively better product than the competition, something they could not have done by experimental measurement alone. These wheels handle better because Zipp now knows how to control the airflow about the wheel to prevent buffeting by crosswinds. This is not lost on the amateur racing market, which has snapped these wheels up.

Steve told us that the required size of these design simulations might be larger than we had thought.  The limits of experimental measurement are more easily factored into design margins than the limits of computation. Reduction of design uncertainty is helped by very high fidelity simulation, but at the cost of long run times and large computers. Increasing a designer’s confidence that HPC has accurately defined his margins may actually increase the amount of computation beyond what we might have expected.

Steve also described the engineer who carried out the simulations and his collaboration with the technical leaders at Zipp. Matt Godo is a “renaissance scientist,” one of those gifted workers who not only understand the software, but also how to use it on a high performance computer. He is also a racer, so he understands what the results mean and how to communicate with the experts at Zipp. Their success depended upon engineers with enthusiasm, who already possessed the knowledge to act upon that enthusiasm. We do not have enough engineers with this broad palette of knowledge, and we are not producing them in sufficient numbers to meet the demands of companies.

Dawn White is co-founder of Accio Energy [3], a small start-up in Michigan that is bringing a unique renewable energy product to market. Imagine a product, vaguely reminiscent of a window screen in size and shape, that distributes a charged water mist from its grid and collects energy from the voltage gradient as wind carries the mist downstream. Ingenious, huh? Dawn pointed out that this is not a new idea; it originated in the 1970s, at a time when experiment was the only way to develop a new concept. The possible design space here is very large. Searching it with prototype builds and measurements is prohibitively expensive.

No one has successfully trodden this path before. The goal, obviously, is to produce a device that generates significantly more electrical energy than that with which it must be seeded. Computation, however, is the only way for a small company to explore this design space. Dawn and her engineers have now found a configuration that promises to provide not just net power, but power at a level beyond the ratios required for an economically viable design. We’ve seen this multiple times: the exploration of a design space by computation is much more rapid and inexpensive than by prototype build.

But Dawn also described her company’s engineers as highly computer literate, knowledgeable about how computation is applied, and people whom she was fortunate to find. We see this theme again. The renaissance scientists who combine these skills are rare and sought after.

Frank Ding works for Simpson Strong-Tie [4], a maker of anchors, connectors and fastening systems for home construction. The rapid design of new fastener systems to meet changing standards faces the same time-to-market pressures that drive other small businesses. Simpson Strong-Tie is a leader in adopting computation as a means to explore new designs and to virtually test them, having first made the move to high performance computing in the mid-2000s. Frank described several new systems designed by computing, but their designs for concrete anchors told a compelling story. Laboratory design of concrete test fixtures requires curing of the concrete, which takes as much as thirty days. Finite element analysis on the computer reduces this cycle to a few days and allows prototypes to be explored and tested much more rapidly. It is improvements in several processes like this that have led Simpson to successively build out their computing expertise.

Frank spoke of one more issue that I see echoed in Steve’s and Dawn’s talks: the support of senior managers willing to consider changes in the manufacturing workflow … if it makes sense. The three speakers offered varying views of why computing has become so important to their workflows. While each is clearly grounded in the business realities of making these technology investments, all have been aggressive in seeking out and adopting the technology.

There is no magic compulsion to simply adopt computing for its own sake. Each speaker could show the reasons for the shift through an analysis of their return on investment. We knew this, but it was a clear, common theme in these talks and throughout the conference. It reinforces the fact that a sober business-case evaluation is even more important in the decision to introduce HPC than the appeal of the technology itself. Techies like me have to keep this in mind.

The three case studies stand upon the expertise of computationally literate engineers, whom each company was fortunate to find. These are three more examples showing that our current HPC workforce is found, not consciously made. We simply must work on the workforce problem to provide American manufacturers with the qualified people who will allow stories like these three to become the norm.

-----

References:

[1] http://www.ilight.com/

[2] http://www.digitalmanufacturingreport.com/dmr/2011-12-09/bicycle_racing_on_the_computer_modeling_and_simulation_for_a_small_business.html

[3] http://www.accioenergy.com/

[4] http://www.strongtie.com/

The illustration depicts Accio Energy's concept for blade-free wind energy generation.
