Jacqueline Chen, Ph.D.
Sandia National Laboratories
Good evening. I’d like to thank the members of the Society of Women Engineers for this great honor. I am truly humbled as I look across the room and see so many talented women engineers and supporters who are contributing enormously to science and technology to make the world a better place for all of us. I have to say it’s been extremely rewarding to pursue a career in computational science and engineering. While I’ve been driven by the science of turbulent combustion, most gratifying has been the opportunity to work with a diverse group of people — young and old, women and men, individuals from around the world, and scientists from vastly differing disciplines (computer science, applied mathematics, fluid dynamics, and chemistry). I’ve relished every minute of it — the tending of long computer runs after midnight and the conference calls at odd hours in faraway time zones to work with collaborators. Many of these colleagues — with whom I’ve had heated debates about our research — have wound up as lifelong friends and continuing collaborators. My sincere thanks to all of you.
I am also grateful to Sandia National Laboratories, where I’ve spent my entire professional career, and especially to the leadership who had the foresight to establish the Combustion Research Facility (CRF) at Sandia during the midst of the oil crisis in the Carter administration. The CRF is a DOE Office of Science collaborative research facility hosting over 100 researchers each year who work alongside Sandia staff researchers to solve important combustion problems. I met many of my collaborators through technical exchanges at the CRF.
I’d especially like to thank my nominator, Chris Shaddix from Sandia; Janet Williams of the Sandia Women’s Action Committee (SWAN); those who took the time to write letters of support for my nomination; and the selection committee for its careful deliberations.
This award would not have been possible without my research team, past and present — the postdocs, collaborators, and students over the past 30 years. Many have gone on to have wonderful careers in academia, national laboratories, and industry. I truly appreciate their friendship, hard work, and their shared joy in scientific discovery.
I also want to thank my first program manager at the DOE Basic Energy Sciences Gas Phase Chemical Physics Program, Bill Kirchhoff, now retired, who early on in the ’90s recognized the importance of high performance computing (HPC) in chemical sciences, and took a chance on a young fluids researcher by providing sustained funding and encouragement to pursue fundamental investigations into “turbulence-chemistry” interactions through computation.
I want to acknowledge the long-term computational support I’ve received from the Oak Ridge Leadership Computing Facility, and to thank Doug Kothe, Jack Wells, and Ramanan Sankaran, who have been partners in refactoring software, many, many times, to keep up with and run on some of the world’s largest supercomputers. I also want to acknowledge the DOE Advanced Scientific Computing Research Office for its trust in me to lead interdisciplinary teams of computer scientists and applied mathematicians to develop combustion codes for the exascale machines arriving in just a few years.
Lastly, I want to acknowledge the enduring love and support of my family, my husband, Paul, who has infinite patience and has been there every step of the way as an equal partner in raising our two children, Zachary and Maya. Zachary is working for a start-up in San Francisco, and Maya is a sophomore at MIT, pursuing a career in engineering.
Inspiration For Engineering
During my formative years in Athens, Ohio, in the ’60s and ’70s, during the Vietnam era, I was raised by first-generation immigrants from China. Both of my parents came to this country impoverished, in hopes of pursuing a graduate education and establishing a better life for their future family. They believed America offered boundless opportunities for those who were willing to work hard. My father eventually became a professor of mechanical engineering at Ohio University, and my mother stayed home and raised my sister and me in a small college town. Early in our lives, our parents exposed us to piano lessons and weekend Mandarin lessons taught by the handful of Chinese moms living in Athens, and encouraged us to participate in school activities, including science fairs. My father was a soft-spoken, quiet man who took great pride in his work. I believe it was through watching my father construct all sorts of cam mechanisms and four-bar linkages in his basement study for hours on end, while listening to operatic music on his turntable, that I became interested in engineering.
As I grew up and went off to college at Ohio State University, I decided to try a major in mechanical engineering, following my father’s pragmatic advice that engineers were always in demand. In the late ’70s, there were very few women enrolled in engineering (at most, there were one or two other women in my classes). Back then, homework problems and exams that relied on computation required adeptness at manipulating slide rules (analog calculators), at least for a short time, until electronic calculators became widely available.
I was fortunate to have the mentorship of Professor Lit Su Han and his graduate students, who invited me as an undergraduate to work part time instrumenting a turbine blade with embedded thermocouples for a heat transfer experiment conducted in a subsonic wind tunnel. Watching the smoke visualization for the first time, I was amazed to see the beautiful patterns of turbulent eddies over the blade, which piqued my interest in fluid dynamics.
An integral part of my undergraduate job was learning to machine parts needed for the experiment, and as such, I was introduced to the machine shop and its supervisor, an experienced, gruff fellow, who at first welcomed me to the shop each time by handing me a broom and dust bin, suggesting that I sweep up the metal shavings on the floor. I laughed off his — perhaps neatnik, perhaps chauvinistic — gesture, as I was determined to learn to use the band saw to cut the plexiglass end plates for our turbine blade. After several unsuccessful attempts to follow the curved outline of the part (and a few broken blades), he realized that I was determined to make it; he then taught me how to use the band saw correctly, and we became good friends after that.
Upon graduating from Ohio State, I joined Sandia National Laboratories and, through the Sandia-sponsored One-Year-on-Campus (OYOC) graduate student fellowship for minorities and women, went to UC Berkeley for my master’s degree. There, I received excellent mentorship from Professor Boris Rubinsky, who taught me how to carefully freeze biological tissue (bovine tissue) so as to study its morphology as it thawed out. Many years later, Rubinsky and his medical colleagues developed a cryosurgery technique based on the notion of selectively freezing tumors guided by ultrasound and MRI imaging. I returned to Sandia after completing my master’s degree at Berkeley and spent several years performing finite element heat transfer analysis. A few years later, I decided to go back to school to pursue a Ph.D. at Stanford University under the Sandia Doctoral Study Program, and became interested in computational fluid dynamics (CFD), a relatively new tool for probing the nuances of turbulence, which I had first observed in the wind tunnel. CFD blossomed in the mid-’80s at the NASA Ames Research Center and the newly founded Stanford/Ames Center for Turbulence Research. This turned out to be the early place to be for computational turbulence research — although we didn’t know it back then, crammed in a six-desk office at NASA Ames next to a wind tunnel, which would rattle our windows whenever it was in operation. The rise in importance of CFD was due largely to the concurrent rapid growth in HPC and the development of high-fidelity computational tools like direct numerical simulation (DNS) and large-eddy simulation (LES), which required all of the horsepower of even those relatively early supercomputers — less powerful than the processors in our laptops today.
A Symbiotic Relationship
Turbulence is ubiquitous. It is present throughout nature and can lead to the bumpiness that we occasionally experience on a plane. Turbulence is also partly responsible for the spread of wildfires and tornadoes and the devastation they produce. But, if understood and tamed, turbulence, and in particular turbulent combustion, can be engineered to provide more fuel-efficient engines than would be possible with slower, less-violent mixing. These are the engines that we rely on daily to provide electricity in our homes and offices and to transport people and goods.
Turbulent processes, including combustion, are hard to understand and control because of the large dynamic range of spatial and temporal scales — from the biggest eddies, the size of the device, to the smallest eddies, where energy and heat are dissipated at the molecular scale — and because of the nonlinearities associated with chemistry and its interaction with turbulent mixing. For example, in combustion, thousands of species and chemical reactions are needed to accurately represent a gasoline or diesel fuel surrogate.
Owing to its multiscale, multiphysics nature, first-principles direct numerical simulation (DNS) of turbulent processes, including combustion, requires enormous computing power, and over several decades it has kept pace with Moore’s Law (transistor counts doubling about every two years), riding the resulting exponential growth in computing. Over a period of almost 40 years, computing power has grown by eight orders of magnitude. This has enabled much greater fidelity in the physical representation of turbulence and chemistry. When I started at NASA Ames in the ’80s, it was possible to directly simulate only weak turbulence, and chemistry was represented by a single global reaction step. With today’s petascale computing (10^15 floating point operations per second), it is feasible to resolve a much broader range of turbulence scales and to represent flames with over 100 species, and hence to approach device-relevant parameter regimes. Recently, it has also become feasible to compose computational workflows that incorporate in situ analytics and machine learning to detect anomalous physical events, such as incipient autoignition, which can be used to steer local adaptive computational mesh refinement to place more grid points where interesting events are about to occur.
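As a rough sanity check on those growth figures, the sketch below assumes a performance-doubling period of about 18 months (a commonly quoted performance corollary of Moore’s Law; the specific numbers are illustrative, not taken from the talk):

```python
# Back-of-the-envelope: sustained exponential growth in computing power.
# Assumption (illustrative): performance doubles roughly every 18 months.
import math

years = 40
doubling_period_years = 1.5

growth = 2 ** (years / doubling_period_years)          # total speedup factor
orders_of_magnitude = math.log10(growth)               # powers of ten

print(f"{growth:.2e}x growth, about {orders_of_magnitude:.1f} orders of magnitude")
```

With these assumed numbers, 40 years of 18-month doublings works out to roughly eight orders of magnitude, consistent with the figure above.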
Early on, I realized that to utilize new HPC systems I could not rewrite my code in isolation, as the systems were becoming more and more complex. The mantra was: collaborate or die! It was necessary to collaborate with multidisciplinary teams of computer scientists, applied mathematicians, and computational scientists and engineers. To flourish required not only access to increasingly capable hardware, but also deep physical sciences domain knowledge and computationally efficient software, from the application down to the operating system, all of which had to work seamlessly together, tailored to the underlying hardware constraints. In other words, a critical mass of the right people, focused on a target challenge problem for a sustained period of time, was required.
I’ve had the good fortune to lead several interdisciplinary teams of researchers with diverse expertise, which presents both challenges and rewards. Each community’s culture and technical jargon is different, and, in addition to having a common technical goal, I found that honest communication, building trust, and occasional socialization outside of work were vital to working effectively together. You had to be willing to step outside your comfort zone and venture into someone else’s sandbox and way of looking at things. As a fluid dynamicist, it’s interesting to see just how differently I view the world from my computer science colleagues; this dichotomy is embodied, for example, in the software that we each write to accomplish the same task. I write “stencil” code to solve large systems of partial differential equations (PDEs) in Fortran, with nested loops emulating the underlying governing physical laws, the Navier-Stokes equations; my computer science colleague, on the other hand, will write the same stencil code by counting floating point operations and trying to minimize the working set size for a given machine architecture, resulting in code that is undecipherable to a fluid dynamicist. These underlying differences in our cultures are simultaneously a source of great frustration at times and a fountain of new ideas that push computational science and technology forward. For example, developing new programming abstractions that make a code readable by an engineer, while also making it run fast by minimizing operations and data movement in and out of a machine’s memory registers, results in code that is far more computationally efficient, enabling larger, more complex simulations to be performed — a successful outcome.
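To make the “stencil” idea concrete, here is a minimal sketch (in Python rather than Fortran, and certainly not the production DNS code): an explicit finite-difference update for the one-dimensional heat equation, written the physics-first way a fluid dynamicist might, with a loop that mirrors the governing equation term by term.

```python
# Minimal "stencil" sketch: explicit finite-difference step for the
# 1-D heat equation u_t = alpha * u_xx. Illustrative only -- a toy
# stand-in for the much larger PDE systems described in the text.

def heat_step(u, alpha, dx, dt):
    """One explicit Euler step using a 3-point second-derivative stencil.

    Boundary values are held fixed (Dirichlet conditions).
    """
    n = len(u)
    u_new = u[:]  # copy so the update reads only "old" values
    for i in range(1, n - 1):
        u_xx = (u[i - 1] - 2.0 * u[i] + u[i + 1]) / (dx * dx)
        u_new[i] = u[i] + dt * alpha * u_xx
    return u_new

# Usage: diffuse an initial spike of heat outward.
# Stability of the explicit scheme requires dt <= dx**2 / (2 * alpha).
u = [0.0] * 11
u[5] = 1.0
for _ in range(10):
    u = heat_step(u, alpha=1.0, dx=1.0, dt=0.25)
```

A performance-minded computer scientist would restructure this same loop, blocking it for cache and vector units, and the result would compute the identical stencil while looking nothing like the equation it solves, which is exactly the dichotomy described above.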
I’d like to leave you with a few parting thoughts. The quality of our lives and those of future generations and the health of our planet will depend on the technology that all of you develop. The future is bright and filled with potential well beyond the current engineering feats: e.g., autonomous vehicles that steer and drive us to our destinations; intelligent cities with network connected devices (wireless sensors, actuators, computers); robotic prostheses that enable wounded veterans to walk again; and unimaginable advances at the intersection between medicine, engineering, and computational sciences. Success in your endeavors will depend on staying the course when the course is harder than expected, taking advantage of complementary expertise and diverse viewpoints, being opportunistic, and bringing along future generations of engineers. Thanks for listening and have a great evening!