Saturday, September 17, 2016

National Science Foundation funds chemical imaging research based on infrared thermography

The National Science Foundation (NSF) has awarded Bowling Green State University (BGSU) and the Concord Consortium (CC) an exploratory grant of $300,000 to investigate how chemical imaging based on infrared (IR) thermography can be used in chemistry labs to support undergraduate learning and teaching.

Chemists often rely on visually striking color changes shown by pH, redox, and other indicators to detect or track chemical changes. About six years ago, I realized that IR imaging may represent a novel class of universal indicators that, instead of using halochromic compounds, use false-color heat maps to visualize any chemical process that involves the absorption, release, or distribution of thermal energy (see my original paper published in 2011). I felt that IR thermography could one day become a powerful imaging technique for studying chemistry and biology. Because the technique doesn't involve any chemical substance as a detector, it could be considered a "green" indicator.
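To make the idea concrete, here is a rough sketch (in Python, with hypothetical file names and pixel regions) of how a sequence of temperature frames exported from an IR camera could be turned into a differential thermal analysis curve: the temperature difference between a sample region and a reference region is tracked over time.

```python
import numpy as np

# Hypothetical inputs: each frame is a 2D array of temperatures (deg C)
# exported from an IR camera, plus a matching array of timestamps.
frames = np.load("ir_frames.npy")        # shape: (num_frames, height, width)
timestamps = np.load("timestamps.npy")   # shape: (num_frames,), in seconds

# Hypothetical pixel regions covering the sample and a reference (e.g., two cups).
sample_region = (slice(40, 60), slice(30, 50))
reference_region = (slice(40, 60), slice(80, 100))

# Differential thermal analysis: mean temperature difference over time.
delta_t = [frame[sample_region].mean() - frame[reference_region].mean()
           for frame in frames]

for t, dt in zip(timestamps, delta_t):
    print(f"{t:8.1f} s   dT = {dt:+.2f} C")
```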

Fig. 1: IR-based differential thermal analysis of freezing point depression
Although IR cameras are not new, inexpensive lightweight models have become available only recently. The release of two competitively priced IR cameras for smartphones in 2014 marked the beginning of an era of personal thermal vision. In January 2014, FLIR Systems unveiled the $349 FLIR ONE, the first IR camera that can be attached to an iPhone. Months later, a startup company, Seek Thermal, released a $199 IR camera that has an even higher resolution and can be connected to most smartphones. The race was on to make better and cheaper cameras. In January 2015, FLIR announced the second-generation FLIR ONE camera, priced at $231 on Amazon. With an educational discount, the price of an IR camera is now comparable to that of a single sensor (e.g., Vernier sells an IR thermometer for $179). All these new cameras can take IR photos and record IR videos just as easily as their conventional counterparts. The manufacturers also provide application programming interfaces (APIs) for developers to blend thermal vision and computer vision in a smartphone to create interesting apps.
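The kind of blending mentioned above can be as simple as a weighted overlay of a false-color heat map on the visible-light photo. A minimal sketch, assuming the two frames have already been aligned and loaded as arrays of the same size (this is an illustration, not any vendor's SDK):

```python
import numpy as np
from matplotlib import cm

def overlay(visible_rgb, temperatures, alpha=0.5):
    """Blend a false-color heat map of a temperature frame onto a visible-light photo.

    visible_rgb:  (H, W, 3) float array with values in [0, 1]
    temperatures: (H, W) array of temperatures aligned to the photo
    """
    # Normalize temperatures to [0, 1] and map them to a false-color palette.
    t_range = temperatures.max() - temperatures.min()
    t = (temperatures - temperatures.min()) / (t_range + 1e-9)
    heat = cm.inferno(t)[..., :3]   # drop the alpha channel of the colormap
    # Weighted blend of the visible image and the heat map.
    return (1 - alpha) * visible_rgb + alpha * heat
```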

Fig. 2: IR-based differential thermal analysis of enzyme kinetics
Not surprisingly, many educators, including ourselves, have realized the value of IR cameras for teaching topics such as thermal radiation and heat transfer that are naturally supported by IR imaging. Applications in other fields such as chemistry, however, seem less obvious and remain underexplored, even though almost every chemical reaction or phase transition absorbs or releases heat. The NSF project will focus on showing how IR imaging can become an extraordinary tool for chemical education. The project aims to develop seven curriculum units based on the use of IR imaging to support, accelerate, and expand inquiry-based learning for a wide range of chemistry concepts. The units will employ the predict-observe-explain (POE) cycle to scaffold inquiry in laboratory activities based on IR imaging. To demonstrate the versatility and generality of this approach, the units will cover a range of topics, such as thermodynamics, heat transfer, phase change, colligative properties (Figure 1), and enzyme kinetics (Figure 2).

The research will focus on finding robust evidence of learning due to IR imaging, with the goal of identifying underlying cognitive mechanisms and recommending effective strategies for using IR imaging in chemistry education. This study will be conducted with a diverse student population at BGSU, Boston College, Bradley University, Owens Community College, Parkland College, St. John Fisher College, and SUNY Geneseo.

Partial support for this work was provided by the National Science Foundation's Improving Undergraduate STEM Education (IUSE) program under Award No. 1626228. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Friday, September 16, 2016

Infrared Street View selected as a finalist in Department of Energy's JUMP competition

JUMP is an online crowdsourcing community hosted by five national laboratories of the US Department of Energy (DOE) and some of the top private companies in the buildings sector. The goal is to broaden the pool of people from whom DOE seeks ideas and to move these ideas to the marketplace faster.

In July, the National Renewable Energy Laboratory (NREL) and CLEAResult launched a Call for Innovation to leverage crowdsourcing to solicit new ideas for saving energy in homes based on smartphone technologies. Modern smartphones are packed with a variety of sensors capable of detecting all kinds of things about their surroundings. Smartphones can determine whether people are home, or close to home, which may be useful for managing their HVAC systems and controlling lighting and appliances. Smartphones can also gather and analyze data to inform homeowners and improve residential energy efficiency.

Infrared images of houses
We responded to the call with a proposal to develop a smartphone app that can be used to create an infrared version of Google's Street View, which we call Infrared Street View. NREL notified us this week that the proposal has been selected as a finalist in the competition and invited us to pitch the idea at the CLEAResult Energy Forum in Austin, TX next month.

The app will integrate smartphone-based infrared imaging (e.g., FLIR ONE) and Google Maps, along with the smartphone's built-in sensors such as the GPS sensor and the accelerometer, to create thermal views of streets at night in the winter, in order to reveal possible thermal anomalies in neighborhoods and raise people's awareness of energy efficiency. These infrared images may even have business value. For example, they may provide information about the condition of a building's windows that could be useful to companies interested in marketing new windows.

The app will be based on the FLIR ONE SDK and the Google Maps API, backed by a program running in the cloud to collect, process, and serve data. The latest FLIR ONE model now costs $249 and works with common Android and iOS devices, making it possible for us to implement this idea. A virtual reality mode will also be added to enhance the visual effect. So this could be an exciting IR+VR+AR (augmented reality) project.
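As a sketch of how the crowdsourced data might be organized on the server side (the schema below is purely hypothetical and not part of the FLIR ONE SDK or the Google Maps API), each submission could be stored as a geotagged thermal snapshot and served back to the map as a GeoJSON feature:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class ThermalSnapshot:
    """One geotagged IR image captured by a participating smartphone (hypothetical schema)."""
    image_url: str      # false-color IR image uploaded to cloud storage
    latitude: float     # from the phone's GPS sensor
    longitude: float
    heading_deg: float  # camera direction, from the compass/accelerometer
    captured_at: str    # ISO 8601 timestamp (night-time winter scans)
    consent: bool       # owner's permission to publish on the public map

def to_map_feature(snapshot: ThermalSnapshot) -> str:
    """Convert a snapshot into a GeoJSON point feature that a map overlay could consume."""
    feature = {
        "type": "Feature",
        "geometry": {"type": "Point",
                     "coordinates": [snapshot.longitude, snapshot.latitude]},
        "properties": asdict(snapshot),
    }
    return json.dumps(feature)
```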

You may be wondering who would be interested in using the app to create the infrared street views. After all, the success of the project depends on the participation of a large number of people. But we are not Google and we do not have the resources to hire a lot of people to do the job. Our plan is to work with schools. We have a current project in which we work with teachers to promote infrared imaging as a novel way to teach thermal energy and heat transfer in classrooms. This is an area in science education that every school covers. Many teachers -- after seeing an infrared camera in action -- are convinced that infrared imaging is the ultimate way to teach thermal science. If this project is used as a capstone activity in thermal science, it is possible that we can reach and motivate thousands of students who would help make this crowdsourcing project a success.

Those who know of earlier efforts may consider this initiative a new attempt to advance the idea. The main new things are: 1) our plan is based on crowdsourcing with a potentially large number of students equipped with smartphone-based IR cameras, not a few drive-by trucks with cameras that homeowners have no idea about; 2) the concerns about privacy and legality should be mitigated, as students will scan only their own houses and, with permission, their neighbors' houses, and will publish their images to the map only when their parents and neighbors allow it; and, most importantly, 3) unlike the previous projects, which did not put people first, our project starts with the education of children and has a better chance of convincing adults.

Thursday, September 8, 2016

Modeling Charlottesville High School's solar project using Energy3D

Schools have plenty of roof space that can be turned into small power plants to provide electricity to students. Many schools have already taken action, and some teachers even use the subject matter in their teaching. But in most cases, students are not deeply involved in solarizing their own schools.

Fig. 1: Google Map 3D vs. Energy3D
Sure, students are not professional engineers, and adults may not trust them with serious investments in solar energy. But there is a safe way to let them try: computer simulation allows students to model and design solar panel arrays for their schools without incurring any cost, risk, or injury.

Fig. 2: 88-panel arrays on CHS's roof.
There are scores of software programs for professional solar designers, but they typically cost on the order of $1,000 per license or annual subscription because their market is a small niche. In addition to this cost barrier for schools, most of these tools do not necessarily cover education standards or support student learning. Thanks to the National Science Foundation, there is now a powerful free alternative for all students and teachers -- Energy3D. A one-stop shop for solar power design and simulation, Energy3D is an extremely versatile CAD tool that can be used to design rooftop solar solutions not only for average homes but also for large buildings (you may have also seen that it can be used to design utility-scale concentrated solar power stations). Importantly, Energy3D provides excellent 3D graphics, rich visualizations, and powerful analytical tools that support scientific inquiry and engineering design at a fundamental level. These features make Energy3D a perfect tool for engaging students and fostering learning.

Fig. 3: Solar irradiance map (June 22)
We are collaborating with Charlottesville High School (CHS) in Virginia to plan a pilot test of the Solarize Your School project, in which students will learn science and engineering concepts and principles by designing large-scale solar panel arrays that achieve optimal cost effectiveness. To make sure every student has the same building to solarize, I sketched up an Energy3D model of CHS, shown in Figure 1, to provide to students as a starting point. If you want to do this for your own school, you can import a Google Maps image of the school using the Geo-Location menu in Energy3D. After the map image shows up in the view, you can draw directly on top of it to get the basic shape right. While it may not be possible to get the exact heights from Google Maps, you can use the elevation data provided by Google Earth to calculate the heights of the walls and roofs, as in the small example below.
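For instance, if Google Earth gives the ground elevation at the base of a wall and the elevation at the roof edge directly above it, the wall height is simply the difference (the readings below are made up for illustration):

```python
def wall_height(ground_elevation_m: float, roof_edge_elevation_m: float) -> float:
    """Estimate a wall height from two Google Earth elevation readings (meters)."""
    return roof_edge_elevation_m - ground_elevation_m

# Hypothetical readings for one wall of the school building
print(wall_height(131.0, 137.5))   # -> 6.5 m
```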

CHS currently has six arrays of solar panels installed on its roof. Five arrays have 88 panels each and one has 10. The panels are arranged in three rows, in portrait orientation and at a tilt angle of 10 degrees (Figure 2). All the panels are 240 W AP-240 PK modules from Advanced Solar Photonics (ASP). Their solar cell efficiency is 14.82%. Their temperature coefficient of Pmax (a property that measures the decrease of solar output as the temperature rises) is -0.4%/°C. Their size is 1650 x 992 x 50 mm. Each panel has three internal bypass diodes. The arrays use REFUsol string inverters to convert electricity from DC to AC, meaning that these arrays probably have little to no tolerance for shade and should be placed as far away from any tall structure as possible. I couldn't find the efficiency of the string inverters, so I chose 90% as it seems typical. I also didn't know the dust level in the area or the cleaning schedule, so I applied a 5% dust loss throughout the year (although the dust loss tends to be higher in the spring due to pollen). Since they went into operation on March 1, 2012, these panels have generated a total of 605 megawatt hours (MWh) as of September 8, 2016, amounting to an average annual yield of about 135 MWh.
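As a back-of-the-envelope check on how these parameters enter the calculation (this is only a rough single-panel estimate, not Energy3D's actual model), one can derate the rated power by irradiance, cell temperature, soiling, and inverter efficiency; the operating conditions in the example are made up:

```python
def panel_output_w(irradiance_w_per_m2: float, cell_temp_c: float,
                   rated_power_w: float = 240.0,       # AP-240 rated output at standard test conditions
                   temp_coeff_per_c: float = -0.004,   # -0.4%/°C temperature coefficient of Pmax
                   inverter_efficiency: float = 0.90,  # assumed string inverter efficiency
                   dust_loss: float = 0.05) -> float:  # assumed year-round dust loss
    """Rough AC output of one panel; standard test conditions are 1000 W/m^2 and 25 °C."""
    dc = rated_power_w * (irradiance_w_per_m2 / 1000.0)
    dc *= 1.0 + temp_coeff_per_c * (cell_temp_c - 25.0)   # temperature derating
    dc *= 1.0 - dust_loss                                  # soiling loss
    return dc * inverter_efficiency                        # DC-to-AC conversion

# Example (made-up conditions): 900 W/m^2 on the panel, 45 °C cell temperature
print(f"{panel_output_w(900.0, 45.0):.0f} W per panel")   # roughly 170 W
```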

Fig. 4: Prediction vs. reality.
I added these solar panel arrays to the Energy3D model with their parameters set for simulation. Figure 3 shows a heat map visualization of solar irradiance on June 22, indicating the extent of the major shaded areas. Figure 4 shows the comparison of the predicted output and the actual output over the past 12 months. As some of the arrays were under maintenance for part of the past year, I picked the highest-performing array and multiplied its output by five to obtain a number that would fairly represent the total yield under ideal conditions. Also note that, as there is currently no weather data for Charlottesville, I picked nearby Lynchburg, about 68 miles southwest of Charlottesville, as the location.

The total output predicted by Energy3D is a bit higher than the actual output over the past year (139 MWh vs. 130 MWh). If we compare the predicted result with the four-year average, the difference is smaller (139 MWh vs. 135 MWh). In terms of the monthly trend, Energy3D seems to underestimate the winter outputs and overestimate the summer outputs. While the results may be satisfactory for educational use, we will continue to improve the fidelity of Energy3D simulations.