Exercise E: Making Data Meaningful

Can the Numbers Speak for Themselves? Managing Writing Style, Audience Expectations, and Context to Make AEV R&D Data Meaningful


Background

Engineering is founded on developing procedures and processes that generate data, which is then collected through a variety of means (software, equipment and instruments, paper and pen, photographs and sketches, etc.). We in STEM value the reliable and valid production of data, and depend on it to make decisions needed to develop innovations and maintain existing engineered systems and products.

While you may have heard the adage “the numbers speak for themselves,” professional engineers understand that data, particularly “raw” data, should always be interpreted in context. The meaning one engineer derives from a data set may not align with what another engineer sees, even an engineer from the same discipline. This is because contextual factors influence how we make meaning from data. Our interpretations of data are influenced by our prior experiences with similar data, our understanding of the limitations of the data collection method or experimental design, and our knowledge of how the client wishes the data to be used.

What does this mean for writing and speaking about data? Simply that we must precisely and directly articulate what the data mean, as well as acknowledge the key contextual factors that contribute to that meaning and affect our interpretations and comparisons.

Practice

  1. You have now run some preliminary tests of your AEV and collected data on its performance criteria. Now, write an email directed to the company CEO, explaining precisely and directly the results of your tests. Explain what conclusions you have drawn from these data (and how you came to those conclusions) as well as your decisions for next steps in AEV development. Conclusions may range from insights about the efficacy of your code, the choice of braking system, the need to balance the weight distribution, or any other criterion on which your group has chosen to focus. You may use graphs of your AEV energy expenditure tests to support your conclusions and illustrate key points.
  2. With your team, present the results of your tests and your conclusions to another team, using your energy expenditure graphs as visual aids.
  3. Once both teams have presented, discuss your results together and determine the key differences between your designs and test results.

Your email should not only describe what your conclusions and decisions are, but why/how you came to those conclusions and decisions. For example, “Based on an average stopping error of 30.5 cm per trial, we concluded that coasting is not a precise enough method to reduce the AEV’s speed…”

Considerations:

  • You may express raw data results (e.g., the AEV sample expended 120 joules to travel the course, while our AEV#1 expended 100 joules), but a comparison of the raw numbers is not useful for understanding the significance of your improvement. How can you process and interpret the data to make it more meaningful—and make the benefit of your innovation clearer—to the CEO (e.g., “AEV#1 was 16.67% more efficient than the AEV sample”)?
  • If you refer to your energy expenditure graphs to support your explanation to the CEO, remember that graphs help us visualize raw data in a way that makes our interpretations easier to understand, but they also require interpretation. Explain your graphs of the energy expenditure data in terms of direct comparisons among prototypes, using precise units. Also carefully consider which units of comparison are most useful for highlighting your most significant innovations relative to the initial model (e.g., perhaps your AEV is 15 g lighter than the prototype, but the most significant difference is that your use of the motor reduces energy usage by 56% (100 joules)).
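The kind of processing the considerations above describe can be made concrete with a short calculation. The sketch below uses the example joule figures from this exercise (120 J for the AEV sample, 100 J for AEV#1); these are illustrative values, not real test data, and the helper function name is hypothetical.

```python
# A minimal sketch of turning raw energy readings into a processed,
# comparative figure. The joule values are the example numbers from
# this exercise, not measurements.

def percent_reduction(baseline: float, improved: float) -> float:
    """Return the percentage by which `improved` undercuts `baseline`."""
    return (baseline - improved) / baseline * 100

sample_energy = 120.0  # joules expended by the AEV sample
aev1_energy = 100.0    # joules expended by our AEV#1

reduction = percent_reduction(sample_energy, aev1_energy)
print(f"AEV#1 expended {reduction:.2f}% less energy than the sample")
```

Reporting the processed figure (a 16.67% reduction in energy expended) tells the CEO the magnitude of the improvement directly, whereas the raw pair “120 J vs. 100 J” leaves that interpretive work to the reader.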

Discussion:

  • How does listening to another team’s presentation of test results influence your team’s ideas and conclusions about your design? Did it make you rethink any aspects of your design?
  • Did the other team process and explain their data similarly to how your team did? What kind of comparisons best highlighted the AEV innovations that a team had made?
  • Were there any instances where the presenting team could have interpreted or ‘processed’ the data to make it more meaningful or compelling? If the team used graphs, did the graphs effectively support their claims? How so?

License


Fundamentals of Engineering Technical Communications Copyright © by Leah Wahlin is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.
