Real-time mid-wavelength infrared scene rendering with a feasible BRDF model

Infrared Physics & Technology 68 (2015) 124–133

Wu Xin *, Zhang Jianqi, Chen Yang, Huang Xi

School of Physics and Optoelectronic Engineering, Xidian University, Xi'an, Shaanxi 710071, China

Highlights

- A new BRDF model was proposed to be used in large-scale scene simulation for the mid-wavelength infrared.
- The proposed BRDF was verified by both simulation and field experimental results.
- A complete MWIR rendering system (PRISSE) was implemented with the proposed BRDF model and the atmospheric effects.

Article history: Received 24 August 2014. Available online 3 December 2014.

Keywords: Infrared simulation; Large scale scene; BRDF; Real time rendering

Abstract: Practically modeling and rendering the surface-leaving radiance of large-scale scenes in mid-wavelength infrared (MWIR) is an important feature of Battlefield Environment Simulation (BES). Since radiation transfer in realistic scenes is complex, it is difficult to develop real-time simulations directly from first principles. Nevertheless, it is crucial to minimize distortions in the rendering of virtual scenes. This paper proposes a feasible bidirectional reflectance distribution function (BRDF) model to deal with a large-scale scene in the MWIR band. Our BRDF model is spectrally dependent and evolved from previous BRDFs, and it meets both Helmholtz reciprocity and energy conservation. We employ our BRDF model to calculate the direct solar and sky contributions. Both of them are added to the surface thermal emission in order to give the surface-leaving radiance. Atmospheric path radiance and transmission are pre-calculated to speed up the rendering of large-scale scenes. Quantitative and qualitative comparisons with MWIR field data are made to assess the rendering results of our proposed method. © 2014 Elsevier B.V. All rights reserved.

1. Introduction

Large scale scene rendering (LSSR) in the thermal infrared is of growing interest in the Battlefield Environment Simulation (BES) area [1–3]. The virtual vision technique has been widely applied to the development of infrared sensors and signal processing algorithms, as well as to games and movies [4,5]. The characteristics of the land surface in the infrared are complex. Especially in MWIR, where a large-scale scene presents diverse landforms, the bidirectional reflectance distribution function (BRDF) model needs much more careful attention in rendering the scenes: it should meet the basic optical principles as well as the requirement of real-time performance. In the visual scene simulation community, color fidelity and real-time performance are the primary concerns. Some plausible BRDF models were proposed to describe the reflectance properties of a surface with a few parameters of the natural materials for

* Corresponding author. E-mail address: [email protected] (X. Wu). http://dx.doi.org/10.1016/j.infrared.2014.11.011 1350-4495/© 2014 Elsevier B.V. All rights reserved.

simplification [6,7], such as reflectivity and roughness. However, in the MWIR bands, more thermal properties should be taken into account in order to achieve a feasible approach and quantitative results. The BRDF was first proposed by Nicodemus [8] in 1970. Since then, several BRDF models have been introduced to describe the directional distribution of the reflected flux. They are categorized into three types: semi-empirical models [9,10], measured-data-driven models [11], and analytical models [12]. Some of them have been successfully applied to scene illumination for Computer Graphics (CG), which is suitable for some true-to-life visible-light scenes [13]. Here we concentrate on physically plausible BRDF models in order to produce quantitative MWIR scenes that can be assessed. One typical physically-based BRDF model is the Torrance–Sparrow model [14], which was further developed by Cook and Torrance [15] for CG. It introduced microfacet theory to build the BRDF by assuming that the microfacet surface normals follow a Gaussian distribution. Following microfacet theory, some new physically based BRDF models have been proposed. The Ward BRDF model [16] is popular for its physical accuracy and good


computing performance in CG implementations, and it has been adopted by DIRSIG (http://dirsig.org) for multi-spectral composite imagery. Recently, a new Ward BRDF model [12] was developed to correct the energy-conservation violation found in the improved Ward model, the Ward–Dür BRDF model [17]. The new Ward model performs well at grazing incident angles and has been used to calculate the effective emissivity of isothermal blackbody cavities [18]. Our proposed BRDF model is based on the new Ward BRDF model, in which energy is conserved. The new Ward model has been verified to meet the basic principles of BRDF theory, Helmholtz reciprocity and energy conservation [19], which is where we can carry on in our thermal infrared scene rendering. Nevertheless, that model was originally developed for the visible band, and only a few reflectance spectra have been applied to a single surface. In the MWIR bands (generally 3–5 μm), the radiance received by instruments from objects is a mixture of self-emission, solar reflectance (bidirectional reflection), sky reflectance (multi-directional reflection), and atmospheric transfer. Some BRDF models related to the thermal infrared can be found in the literature [20–23]. The QUick Image Display (QUID) model [24], which employed the Sandford–Robertson model [9], could render infrared targets at animation rates. However, very few of the above-mentioned models offered quantitative analyses or comparisons with field scene data for large-scale scene rendering. Observing the different reflective properties of mid-wavelength and long-wavelength infrared scenes, we propose to use wavelength as a parameter to modulate the roughness, referred to as the effective roughness factor (ERF). We thus derive a new feasible BRDF model and verify it quantitatively with field data. This paper presents a novel BRDF model for large-scale scenes with general rough surfaces in the MWIR bands.
It is based on thermal radiation theory and geometrical optics, and it can be used to render various types of landforms in real time. Additionally, the ERF has been employed to enrich the details of a simulated scene by taking the atmospheric effects into account. As our model focuses on the rendering of large-scale land surfaces, polarization and anisotropy are not considered here. The rest of this paper is organized as follows. The theory of our new BRDF model is described in Section 2, together with the derivation and integration of the radiation transfer equation. In Section 3, both field and simulated experiments are conducted to verify the performance and feasibility of our proposed BRDF model. A conclusion is given in Section 4.

2. Theory

2.1. BRDF model

As is known, the BRDF describes the distribution of the reflected energy. Generally it involves three components: the incident direction (θ_i, φ_i), the reflected direction (θ_r, φ_r), and the light wavelength λ. Denote the ratio f_BRDF as [8]:

f_BRDF(θ_i, φ_i; θ_r, φ_r; λ) = dL(θ_r, φ_r) / dE(θ_i, φ_i)  [sr⁻¹],  (1)

where dL is the radiance reflected into the direction (θ_r, φ_r) from an extended source (the sun), and dE is the irradiance incident on the surface from the direction (θ_i, φ_i). The relations and directions of the reflection are illustrated in Fig. 1, with the symbols explained in Table 1.

The new Ward BRDF model [12] modified the previous Ward model to meet energy conservation at grazing angles. This can be seen from its specular reflection formula:

f_newWard = (ρ_s / (π α β)) exp(−tan²δ (cos²φ_h / α² + sin²φ_h / β²)) · 2(1 + cosθ_i cosθ_r + sinθ_i sinθ_r cos(φ_r − φ_i)) / (cosθ_i + cosθ_r)⁴.  (2)

In Eq. (2), α and β denote the material roughness, equal to √2 times the standard deviations of the 2D Gaussian distribution in the perpendicular directions T and P_r. It describes the property of a surface in the visual bands.

Table 1
Symbols used to describe the directional reflection presented in Fig. 1.

Symbol — Meaning
Θ_i, Θ_r — Solid angles of the incident and reflected directions
θ_i, θ_r — Incident and reflected angles
P_i, P_r, P_h — Projections of the incident, reflected, and halfway vectors
φ_i, φ_r, φ_h — Azimuthal angles of the incident, reflected, and halfway vectors
N — Microfacet surface normal vector
δ — Angle between the surface normal and the halfway vector
N_h — Halfway vector between the incident and reflected directions
T — Vector normal to N and P_r
dE(○→Θ_i) — Irradiance incident on the surface
dL(○→Θ_r) — Radiance reflected into Θ_r

Fig. 1. Basic geometry of directional reflection.
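As a numerical illustration of the new Ward model in Eq. (2), the specular term can be sketched as below. The helper that derives the halfway-vector angles from the incident and reflected directions is our own convenience, not part of the paper's formulation.

```python
import math

def new_ward_brdf(rho_s, alpha, beta, theta_i, phi_i, theta_r, phi_r):
    """Specular term of the bounded-albedo ("new") Ward model, Eq. (2).

    rho_s       -- specular reflectance of the material
    alpha, beta -- roughness (sqrt(2) x the standard deviations of the
                   microfacet slope distribution in perpendicular directions)
    theta/phi   -- incident (i) and reflected (r) zenith/azimuth angles, rad
    """
    def unit(theta, phi):
        # Unit vector of a direction given in spherical coordinates.
        return (math.sin(theta) * math.cos(phi),
                math.sin(theta) * math.sin(phi),
                math.cos(theta))

    ix, iy, iz = unit(theta_i, phi_i)
    rx, ry, rz = unit(theta_r, phi_r)
    # Halfway vector and its zenith (delta) and azimuth (phi_h) angles.
    hx, hy, hz = ix + rx, iy + ry, iz + rz
    norm = math.sqrt(hx * hx + hy * hy + hz * hz)
    hx, hy, hz = hx / norm, hy / norm, hz / norm
    delta = math.acos(hz)
    phi_h = math.atan2(hy, hx)
    # Anisotropic Gaussian lobe in the halfway-vector parameterization.
    aniso = math.cos(phi_h) ** 2 / alpha ** 2 + math.sin(phi_h) ** 2 / beta ** 2
    lobe = math.exp(-math.tan(delta) ** 2 * aniso)
    # Normalization that keeps the albedo bounded at grazing angles.
    bound = (2.0 * (1.0 + iz * rz + math.sin(theta_i) * math.sin(theta_r)
                    * math.cos(phi_r - phi_i))
             / (iz + rz) ** 4)
    return rho_s / (math.pi * alpha * beta) * lobe * bound
```

Because the halfway vector and the normalization factor are both symmetric in the incident and reflected directions, swapping them leaves the value unchanged, i.e., Helmholtz reciprocity holds by construction.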


However, it is insufficient to treat the material roughness without considering the effect of the wavelength in the MWIR bands. As the light wavelength becomes longer, radiation is reflected more easily between the microfacets; as a result, the final radiant flux density per unit solid angle on the infrared focal plane array (IFPA) changes. Hence, it is necessary to modify the roughness factor to fit the BRDF in MWIR. In addition, for a large-scale scene the reflectances of the sun and the atmosphere cannot be ignored, and it is important to take their effects into account in the multiple directional reflection [25]. We propose the following method to calculate the reflected radiance in MWIR:

f_irBRDF(θ_i, φ_i, θ_r, φ_r) = F(λ, θ_s) · r(λ) · D(δ, φ_h, θ_s).  (3)

Our new infrared BRDF consists of three components. F(λ, θ_s) is the specular factor, a function of the wavelength λ and the halfway angle θ_s between the incident and reflected directions, which is determined by (θ_i, φ_i, θ_r, φ_r). Here, Schlick's approximation of Fresnel's reflection law is used [26]:

F(λ, θ_s) = ρ_λ + (1 − ρ_λ)(1 − cosθ_s)⁵,  (4)

where ρ_λ ∈ [0, 1] is the reflection factor at wavelength λ. The second function, r(λ), is the wavelength-related roughness, the inverse of the Cauchy distribution:

" # ðk  k0 Þ2 þ c2 rðkÞ ¼ p :

ð5Þ

c

The location parameter λ₀ is the central wavelength in the integration of the rendering equation. The scale parameter γ describes the physical roughness and is a constant for each type of surface. In spectroscopy, the Cauchy distribution describes the shape of spectral lines when atoms interact with the surface [27]; we introduce it here to modulate the roughness with wavelength. One can see that the larger the scale parameter γ, the smoother the surface. In a large-scale scene, the rugged ground is modeled by thousands of microfacets whose normals follow a Gaussian distribution [14]. Considering the energy reciprocity at grazing angles [12], we express the directional factor D(δ, φ_h, θ_s) as:

D(δ, φ_h, θ_s) = (1/π) exp(−tan²δ) · 1 / (4 cos²θ_s cos⁴δ).  (6)

Thus, our proposed BRDF for the infrared obeys the two basic principles of BRDF theory in addition to introducing the wavelength-modulated roughness. In the following sections, the proposed BRDF model will be used to derive the reflectance from the sun and the sky in order to form a feasible rendering equation for an infrared scene.
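The three factors of the proposed model can be sketched numerically as below; this is a minimal illustration of Eqs. (3)–(6), with function and parameter names of our own choosing, and the azimuth φ_h omitted from the directional factor since it drops out of Eq. (6) as written.

```python
import math

def specular_factor(rho, theta_s):
    """Eq. (4): Schlick's approximation of Fresnel reflection; rho is the
    reflection factor for the wavelength in question."""
    return rho + (1.0 - rho) * (1.0 - math.cos(theta_s)) ** 5

def effective_roughness(lam, lam0, gamma):
    """Eq. (5): wavelength-modulated roughness, the inverse of a Cauchy
    distribution centred on the band wavelength lam0 with scale gamma."""
    return math.pi * ((lam - lam0) ** 2 + gamma ** 2) / gamma

def directional_factor(delta, theta_s):
    """Eq. (6): directional factor from the Gaussian microfacet
    distribution (phi_h plays no role in the isotropic form)."""
    return (math.exp(-math.tan(delta) ** 2)
            / (math.pi * 4.0 * math.cos(theta_s) ** 2 * math.cos(delta) ** 4))

def f_ir_brdf(rho, lam, lam0, gamma, delta, theta_s):
    """Eq. (3): the proposed MWIR BRDF, F(lam, theta_s) * r(lam) * D."""
    return (specular_factor(rho, theta_s)
            * effective_roughness(lam, lam0, gamma)
            * directional_factor(delta, theta_s))
```

Note that at normal incidence the specular factor reduces to the plain reflection factor, and at the band centre λ = λ₀ the roughness term reduces to πγ, so a larger γ indeed behaves as a smoother surface when γ < 1.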

2.2. Sun reflectance

The reflectance of the sun plays an important role in the infrared profile of an object in the real world, and it is depicted by the specular part of the BRDF. The radiance of the sun, L_rSun, reflected at a point on a surface is calculated by an integration over wavelength:

L_rSun(θ_i, φ_i, θ_r, φ_r) = ∫_{λ1}^{λ2} f_irBRDF E_sun(λ) cosθ_i dλ.  (7)

E_sun(λ) is the direct irradiance of the sun, and θ_i is the angle between the incident direction of the sun and the normal of the object surface. The reflectance at (θ_r, φ_r) is then modulated by f_irBRDF as described in Eq. (3). In f_irBRDF, two components are related to λ, so the integration can be reformulated as:

L_rSun(θ_i, φ_i, θ_r, φ_r) = D(δ, θ_s) cosθ_i ∫_{λ1}^{λ2} r(λ) F(λ, θ_s) E_sun(λ) dλ.  (8)

2.3. Sky reflectance and rendering equations

Although the sky is usually considered a blackbody in the mid-wavelength bands, it is necessary to derive a plausible equation for the physical process of reflection, because the sky contributes to the infrared profile of an object. In LSSR the sky is modeled as a distant dome with a homogeneous distribution of radiance, as shown in Fig. 2. The sky is divided into millions of facets, each with a radiance L_ΔΩ, and each facet radiates in a reflection mode of a cone in, a cone out. Integrating the radiation of all the facets over the surface, we obtain the reflection of the whole sky off the surface in different directions, which reflects in a mode of a hemisphere in, a cone out. The exact expression is:

L_rSky = ∫_{λ1}^{λ2} ∫_0^{2π} ∫_0^{π/2} ρ_BRDF L_ΔΩ(θ, φ) cosθ_i′ sinθ dθ dφ dλ,  (9)

where θ and φ represent the zenith and azimuth angles, respectively; θ_i′ is the incident angle at each sky facet; and ρ_BRDF is the bidirectional reflectance ratio, which is derived from f_BRDF:

ρ(2π; θ_r, φ_r) = dΦ_r(θ_r, φ_r) / Φ_i(2π),  (10)

ρ_BRDF(2π; θ_r, φ_r) = (1/π) cosθ_r dΩ_r ∫_{2π} f_BRDF cosθ_i dΩ_i.  (11)

In Eq. (10), Φ_i and Φ_r are the incident and reflected radiant fluxes. The radiance of the sky reflected by a surface can thus be expressed completely. However, for LSSR all the radiance should be calculated in real time or dynamically, and it is impossible to evaluate the complex integration equations while rendering at the same time. We therefore pre-calculate the reflected radiance of the sky with MODTRAN (http://modtran5.com) and organize the data into textures for the dynamic rendering process, as described in Section 3. Finally, both the reflections of the sun and the sky are modeled with our proposed BRDF model, and the rendering equation is expressed as:

L_λ1λ2 = (L_emission + L_reflect) · τ_atmo + L_path.  (12)

The atmospheric transmission, τ_atmo, and the atmospheric path radiance, L_path, are calculated by MODTRAN under different meteorological conditions [28]. Since we render the scene in 3D space, both parameters are pre-calculated and their values are organized into 3D data-cube textures. Denote L_emission and L_reflect as the self-emission radiance and the reflectance of the sun and the sky.

Fig. 2. The sky radiation model.
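To make the radiometric pipeline concrete, the sketch below evaluates the sun term of Eq. (8) by trapezoidal quadrature over a tabulated spectrum, the band emission of Eq. (13) from Planck's law, and the sensor radiance of Eq. (12). The tabulated spectrum and the constant band emissivity are illustrative placeholders; in PRISSE, E_sun(λ), τ_atmo and L_path come from MODTRAN.

```python
import math

H, C, KB = 6.62607015e-34, 2.99792458e8, 1.380649e-23  # SI constants

def trapz(ys, xs):
    """Trapezoidal quadrature over tabulated samples."""
    return sum((ys[k] + ys[k + 1]) * (xs[k + 1] - xs[k]) / 2.0
               for k in range(len(xs) - 1))

def sun_term(delta, theta_s, theta_i, rho, lam0, gamma, lams, e_sun):
    """Eq. (8): D * cos(theta_i) * integral of r(lam) F(lam) E_sun(lam)."""
    f = rho + (1 - rho) * (1 - math.cos(theta_s)) ** 5                 # Eq. (4)
    r = [math.pi * ((l - lam0) ** 2 + gamma ** 2) / gamma for l in lams]  # Eq. (5)
    d = (math.exp(-math.tan(delta) ** 2)
         / (math.pi * 4 * math.cos(theta_s) ** 2 * math.cos(delta) ** 4))  # Eq. (6)
    return d * math.cos(theta_i) * trapz(
        [ri * f * e for ri, e in zip(r, e_sun)], lams)

def emission_term(eps, temp_k, lam1_um=3.0, lam2_um=5.0, n=400):
    """Eq. (13): (1/pi) * integral of eps * M_bb(lam, T) over the band
    (a constant band emissivity eps replaces the spectral eps_lam)."""
    lams = [(lam1_um + (lam2_um - lam1_um) * k / n) * 1e-6 for k in range(n + 1)]
    m_bb = [2 * math.pi * H * C ** 2 / l ** 5
            / (math.exp(H * C / (l * KB * temp_k)) - 1.0) for l in lams]
    return eps * trapz(m_bb, lams) / math.pi

def sensor_radiance(l_emission, l_reflect, tau_atmo, l_path):
    """Eq. (12): band radiance reaching the sensor."""
    return (l_emission + l_reflect) * tau_atmo + l_path
```

As a sanity check, a blackbody at 300 K gives a 3–5 μm band radiance of roughly 1.9 W m⁻² sr⁻¹, consistent with standard blackbody tables.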

Table 2
Hardware used in the simulation experiments.

Hardware — Specifications
CPU — Intel® Core™ i5, 2.80 GHz
RAM — 4.00 GB
GPU — NVIDIA GeForce GTX 590


L_reflect is the sum of L_rSun and L_rSky (see Eqs. (8) and (9)), and L_emission is given by:

L_emission = (1/π) ∫_{λ1}^{λ2} ε_λ M_λ,bb dλ.  (13)

The large-scale background objects are treated as Lambertian bodies [29]. M_λ,bb stands for the spectral exitance in Planck's law, calculated from the wavelength and the absolute surface temperature.

3. Experiments and validation

3.1. Simulation experiments

As opposed to the QUID model, which was based on the semi-empirical Sandford–Robertson model, we implemented our model (based on the new Ward BRDF model) in a GPU-based render pipeline and constructed a platform named the physically reasonable infrared scene simulation engine (PRISSE) [25]. Table 2 lists the specifications of the hardware used for running PRISSE. PRISSE consists of

Fig. 3. Mapping textures onto the geometry mesh grid (visual texture plus property textures: roughness, direct reflectance, emittance, temperature, material ID).

Fig. 4. Real-time rendering results of various landscapes (without atmospheric effects).


Fig. 5. Atmospheric transmittance data cube (indexed by altitude, distance, and zenith angle, from (0 km, 0 km, 0°) to (10 km, 1860 km, 180°)).

Fig. 7. Appearance of the MWIR imager used in our field experiment.

four main components: a Geometry Designer, a Texture Manager, an Atmospheric Calculator, and a Dynamic Render Engine. By loading the related infrared resources into the rendering pipeline, the 3D render engine executes the Cg (C for graphics) [30] scripts in parallel, which contain the physical equations for the BRDF

calculations as described above. The first step of rendering the MWIR scenes is preparing all the models, textures, and infrared-property data. The second step is dynamically calculating the rendering results with the GPU's programmable pipeline. What follows is the implementation of PRISSE. Firstly, the geometry models are the basic elements of a 3D scene. They are generally created with 3D modeling software such as Blender and 3ds Max, while there is an attempt to retrieve

Fig. 6. Dynamic scene rendered by PRISSE under different circumstances: (a) the original scene rendered with temperature only; (b) atmospheric effect added; (c) sky reflectance added on top of (b); (d) sun reflectance added on top of (b); (e) final scene with all the effects involved; (f) plots along the diagonal line.


Fig. 8. Field experiment setup and measured spatial information (medium-wave infrared imager on a cement platform viewing grass lawns A–E separated by cement paths; measured distances include 7.45 m, 42.9 m, 21 m, and 3.3 m).

Fig. 9. Field environment visual band picture.

the 3D scene geometry and textures automatically [31,32]. We here use Blender to design our scene map and construct the 3D geometry models. Blender is suitable for texturing the models, which we take advantage of to map the infrared data onto the surfaces of the models. The mapping process is demonstrated in Fig. 3, where the physical properties of the different materials are organized into texture formats. In PRISSE, the physical parameters of the surface are tiled into a data cube, including material ID, surface temperature, emittance, direct reflectance of the sun, micro-roughness, and other properties or information to be extended. Each of the material property textures is generated by the Texture Manager component. Besides, the Texture Manager is also used to convert the atmospheric data to textures and to pre-calculate a T–R (temperature-to-radiation) mapping table in the MWIR bands. This makes it possible to fetch the radiation from the temperature dynamically in the GPU's sampling process. The atmospheric data are calculated by the Atmospheric Calculator by calling MODTRAN. Provided meteorological conditions (e.g., temperature, humidity, pressure, wind) as inputs, MODTRAN calculates the atmospheric data, such as path radiance and transmittance, needed at each viewpoint in the scene. Given the coordinates of a viewpoint, MODTRAN is called to generate a data cube as shown in Fig. 5. The data cube consists of many pixels, each indexed by the altitude of the sensor, the distance to an object, and the zenith angle, which are retrieved by the GPU in real-time rendering. With all the thermal infrared data ready for PRISSE to simulate the MWIR images, it is still necessary to vary the input parameters (e.g., the position and orientation of the sun) and the virtual sensor to show how the BRDF model affects the final radiance. Cg is hence used to manipulate the vertex and fragment (pixel) data. This makes it easy for programmers to control the shape, appearance, sampling, and properties of rendered objects using programmable graphics hardware such as GPUs. Benefitting from the Cg shaders, we can evaluate Eq. (12) with various view angles and sun positions in parallel. To simulate the time-varying scene, it is important for the Resource Manager to stage the proper data at the right place. In the end, some rendered images obtained with our model are presented with self-radiation, sun reflection, sky reflection, and atmospheric effects for a large-scale scene of 100 × 100 square kilometers including vegetation covers,

Fig. 10. Images taken at 13 o'clock on July 12, 2012 at Xidian University: (a) by a visual-band camera; (b) by the MWIR imager.


hilly lands, wheat fields, rivers, and deserts. Fig. 4 shows the rendering results with different land covers at different times, mainly containing a river, an asphalt road, mountains, and wheat fields. To illustrate the changes of the different landscapes with time, we rendered the scene in Fig. 4 without the effect of the atmosphere.
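The pre-calculated atmospheric textures discussed above can be pictured as a small CPU-side lookup table. The grid resolution, axis ranges, and the Beer–Lambert placeholder used to fill the cube below are illustrative only (the real values come from MODTRAN); the nearest-sample fetch mirrors what the GPU texture sampler does per pixel.

```python
import math

def make_axis(lo, hi, n):
    """Uniformly spaced sample positions for one axis of the cube."""
    return [lo + (hi - lo) * i / (n - 1) for i in range(n)]

class AtmosCube:
    """Transmittance tabulated over (altitude, distance, zenith angle)."""

    def __init__(self, alts, dists, zeniths, fill):
        self.axes = (alts, dists, zeniths)
        # table[a][d][z] -> transmittance, filled offline.
        self.table = [[[fill(a, d, z) for z in zeniths]
                       for d in dists] for a in alts]

    @staticmethod
    def _nearest(axis, value):
        value = min(max(value, axis[0]), axis[-1])  # clamp, like a texture
        return min(range(len(axis)), key=lambda i: abs(axis[i] - value))

    def sample(self, alt_km, dist_km, zenith_deg):
        ia = self._nearest(self.axes[0], alt_km)
        id_ = self._nearest(self.axes[1], dist_km)
        iz = self._nearest(self.axes[2], zenith_deg)
        return self.table[ia][id_][iz]

# Placeholder fill: simple Beer-Lambert extinction with distance.
cube = AtmosCube(make_axis(0.0, 10.0, 8),
                 make_axis(0.0, 50.0, 16),
                 make_axis(0.0, 180.0, 10),
                 fill=lambda a, d, z: math.exp(-0.12 * d))
```

A per-pixel fetch is then a single call, e.g. `tau = cube.sample(0.5, 10.0, 30.0)`; on the GPU the same role is played by a clamped 3D texture read.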

Fig. 11. Simulated scenes based on the field data by PRISSE: (a) rendered in the visible band; (b) rendered in the MWIR band with the BRDF model.

Fig. 12. Statistical radiance charts of different objects within 24 h: (a) lawns labeled 'A' in Fig. 11b; (b) small trees labeled 'B' in Fig. 11b; (c) a big tree labeled 'C' in Fig. 11b; (d) the shopping building labeled 'D' in Fig. 11b.

The gray level of the river and the asphalt road changes significantly with time. As seen in Fig. 4b, when the solar zenith is small the asphalt road has a much higher temperature than the mountains and wheat fields, while at the same time the river is not as hot as the land because of its larger heat capacity. For the river the

speed of losing heat at night is also slower than for the others, leading to a higher gray level at 23:00, as seen in Fig. 4d. This is consistent with infrared physics. The scenes rendered in the morning and in the evening are similar, but all the objects in Fig. 4c actually have a higher temperature after a whole day of sunlight. This also agrees with what is observed in real infrared scenes.

We also experimented on the large-scale MWIR scene under different conditions to verify its correctness. Fig. 6a is the original rendering result that only shows the basic gray scale in radiance. It includes the cloudy sky, mountains, asphalt roads, a river, a power plant, some buildings, and a small village in the distance. For each experiment, Fig. 6f shows the gray scale in radiance versus the pixels along the diagonal line defined in Fig. 6a with a yellow arrowed line. Notice that the first 260 pixels have about the same gray-scale value, retrieved from the sky area in Fig. 6a–e. Based on the equations described in Section 2.3, the radiance data of the sky is pre-calculated by MODTRAN. During this process the atmospheric effects are included, so the sky itself is not affected by the other factors. From Fig. 6f, one can see that atmospheric effects greatly redistribute the radiance of the scene (see Fig. 6b), where the distant mountains blend into the sky. However, the atmospheric effects are complicated and change considerably under different conditions, such as visibility, water vapor, and extinction coefficient. Here our atmospheric effects are pre-calculated using the 1976 U.S. Standard Atmosphere with a visibility of 23 km. The radiance of the sky affects all the objects in the scene with different intensities according to the material properties. It can be seen in Fig. 6c that the increase of radiance on the asphalt roads and the plant covers is different regardless of the viewing position or direction (Eq. (9)). Our new BRDF model for the MWIR scene is shown in Fig. 6d, where the sun reflectance with the BRDF calculated in real time is added. Along with the atmospheric effects, the radiation of the mountains varies with location and orientation, and the buildings are clearly shadowed on different sides, which makes the MWIR scenes more realistic. Also, from Fig. 6f and d, we can see that the BRDF model affects the radiance quite differently compared to the sky model. Finally, considering all the effects mentioned above, Fig. 6e shows the dynamically rendered MWIR scene in a physically plausible profile. The rendering results will be assessed in the next section by comparison with field MWIR data collected by a real MWIR imager.

3.2. Field experiments and assessments

Before PRISSE is applied in the BES field (e.g., hardware-in-the-loop testing systems), it is necessary to validate the reflection models and rendering results of PRISSE to ensure that the simulated MWIR images are usable in applications. A 24-h field experiment on a complex scene was carried out with an MWIR imager manufactured by Beijing Aerospace Changfeng Co., Ltd. The MWIR imager operates at 3–5 μm and has a field of view (FOV) of 13.75° × 11°. Equipped with a Stirling cooler, the detector focal plane array (FPA) is composed of 320 × 256 pixels, each 30 μm × 30 μm in size. Fig. 7 shows the appearance of the MWIR imager. The imager was fixed on top of the concrete platform, looking at the square located in the center of Xidian University. Its orientation and position were carefully measured, as shown in Fig. 8. From the visual-band picture in Fig. 9, we can see that the Etiquette Square is mainly covered by lawns and marble bricks. The five lawns, labeled with the letters 'A, B, ..., E', are separated by narrow concrete paths. The MWIR imager was set up at the square, marked with a red point in Fig. 9. The primary background objects were the sky,


two buildings, some street lights, and some trees. Fig. 10 shows one of the 24-h images, shot at 13 o'clock on July 12, 2013, taken by a normal visual-band CCD camera and the MWIR imager, respectively. After setting up the field experiment platform, we took several MWIR images every hour. At the same time, the temperatures of the five lawns, the concrete paths, and the air were recorded by a thermometer and used as the input texture parameters in PRISSE. For the atmospheric parameters, we selected latitude 34.130° north and longitude 108.830° east with a default visibility of 23 km. Because this field experiment was intended to verify the performance of our MWIR BRDF model, the rendered scenes were synthesized at different times and were mainly affected by the sun direction and temperature; the atmospheric effects were not obvious (see Fig. 11b). After we obtained the field MWIR images and the corresponding simulated images, four objects were selected for radiance comparison, labeled 'A, B, C, D' in Fig. 11b. Since we implemented PRISSE in a real-time mode, all the rendered data changed whenever the virtual camera was moved or the position of the sun was updated. In the time-varying simulation, the camera was located at a fixed position and direction. To compare the two data sets, we need to convert them to a compatible unit. Our MWIR imager outputs the infrared images as voltages with 14-bit integers, while our simulated images are quantized as gray levels from 0 to 255.

Fig. 13. Temperatures of the walls in the field and simulation experiments: (a) field data results; (b) simulation results.
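The common scale used for the comparison can be sketched as a plain min–max rescaling; the sample values below are hypothetical.

```python
def normalize(samples):
    """Min-max normalize a sequence to [0, 1], so that 14-bit imager counts
    and 8-bit rendered gray levels can be plotted on one axis (the
    "Normalized Radiance" of Fig. 12)."""
    lo, hi = min(samples), max(samples)
    if hi == lo:                     # constant signal: avoid divide-by-zero
        return [0.0] * len(samples)
    return [(s - lo) / (hi - lo) for s in samples]

# e.g. 14-bit imager counts and 8-bit gray levels end up on the same scale:
field = normalize([2300, 6100, 16383])
sim = normalize([10, 95, 255])
```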


Thus we normalized them before making the plots in the charts, marked as 'Normalized Radiance' in Fig. 12. In Fig. 12, 'A' is the area of the five lawns on the ground, 'B' stands for the small trees beside the lawns, and 'C' represents a big tree with a dense canopy in front of the shopping building marked 'D'. A square of 4 × 4 pixels in the rendered images is averaged for each area. From Fig. 12, we can see that all the simulations follow a trend similar to the field data, especially for the lawns (see Fig. 12a). This demonstrates a good profile of our BRDF model as compared with the field data. We attribute this to the flatness and uniformity of the grass surface in the real scene, which makes it easy for Blender to build a closely matching model. As for Fig. 12b, there are some mismatches at local points. This can be explained by the differing tree densities of the simulation scene and the real trees. For example, our experiments generated different images for different areas of trees according to the heterogeneous densities: water vapor and wind flow among the leaves are affected by these densities, resulting in heterogeneous radiance distributions. This can also explain the shifted curve in Fig. 12d for the big tree. In real-time rendering, both the small and big trees are time consuming in calculating their BRDF radiance because tens of thousands of leaves are involved. In our simulation, the radiance of the leaves is accumulated in a linear mode, which is not ideal for a real-world scene. One can thus see from Fig. 11b that the leaves share the same radiance, which is worth further improvement in the future. However, in Fig. 12c, on the brick wall of the shopping building, the radiance behaves similarly to the field data, where the reflectance of the sun is obvious at about 13 o'clock in the afternoon. To see the effect of our BRDF model more clearly, we tested the two sides of the shopping building wall in different orientations, one facing south and the other facing east. As the red arrowed line in Fig. 10b indicates, the east side wall received more sunlight in the morning, while the south side wall was exposed to the sun for a long time. It should be mentioned that in Fig. 13 temperature in Celsius is used to measure the intensity of the radiance, converted from the field raw data with calibrated blackbody data. This means that the temperatures of the walls are directly related to the radiance leaving the surface, which includes self-emission and reflection. From the field data in Fig. 13a, we can see that the east side wall warmed early in the morning, around 10 o'clock, and reached its highest point at 15 o'clock in the afternoon. The south wall also reached its highest temperature at about 15 o'clock, with a higher temperature than the east one. In addition, the measured air temperature (also in Fig. 13a), marked with blue square symbols, shows that the temperature of the wall is lower than that of the air except at about 11 o'clock and 17 o'clock.

Fig. 14. A gallery of the rendered MWIR images for large scale scenes.
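The conversion from raw imager counts to temperature via calibrated blackbody data can be sketched as a two-point calibration followed by a Planck inversion. The set points, the 4-μm band centre, and the linear counts-to-radiance assumption are ours for illustration, not details given in the paper.

```python
import math

H, C, KB = 6.62607015e-34, 2.99792458e8, 1.380649e-23  # SI constants

def planck_radiance(lam_m, temp_k):
    """Spectral radiance of a blackbody (W m^-3 sr^-1), Planck's law."""
    return (2.0 * H * C ** 2 / lam_m ** 5
            / (math.exp(H * C / (lam_m * KB * temp_k)) - 1.0))

def counts_to_temperature(counts, counts_lo, counts_hi, t_lo, t_hi,
                          lam_um=4.0):
    """Two-point blackbody calibration: map raw imager counts linearly onto
    band-centre radiance between two reference blackbody temperatures, then
    invert Planck's law for a brightness temperature (in kelvin)."""
    lam = lam_um * 1e-6
    l_lo = planck_radiance(lam, t_lo)
    l_hi = planck_radiance(lam, t_hi)
    rad = l_lo + (counts - counts_lo) * (l_hi - l_lo) / (counts_hi - counts_lo)
    # Invert L = (2hc^2/lam^5) / (exp(hc/(lam k T)) - 1) for T.
    return (H * C / (lam * KB)
            / math.log(2.0 * H * C ** 2 / (lam ** 5 * rad) + 1.0))
```

By construction, counts at the lower calibration point recover the lower blackbody temperature, and intermediate counts land between the two set points because Planck radiance is monotonic in temperature.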

X. Wu et al. / Infrared Physics & Technology 68 (2015) 124–133

the corresponding field data using equalization. It can be seen that the east wall was warmed first around 9 o’clock, same as the field data (see Fig. 13a), and reached its highest temperature at 11 o’clock. At the same time the south wall kept being warmed until 13 o’clock and dropped since then. This indicates that both the simulated south and east walls behaved similar to the field ones, not only on the self-emission but also on the BRDF reflection. As shown in Fig. 13b, although the east wall had a relative low temperature from 14 to 18 o’clock, which is quite different with the field data, the shifted feature of statistic results presented in Fig. 12d indicates that the orientation between the wall and the sun influences the simulation a lot. Finally, we present a gallery of MWIR LSSR images by PRISSE with real time performance in Fig. 14. 4. Conclusions We proposed a new BRDF model for a large scale mid-wavelength infrared scene rendering. The rendering equations were derived based on optical principles. In addition to considering the atmospheric effects, the sun and the sky are also taken into account in the reflection. With extendable interface to various lighting models, physical reasonable infrared scene simulation engine (PRISSE) has been implemented on GPUs to perform the real time rendering thanks to GPU’s parallel architecture. The verification of our BRDF model has been fulfilled using simulations by PRISSE and field data by a real MWIR imager. Some statistic results are also analyzed to explore the weakness of our model that will be improved in the near future. Given that we have constructed the GPU-based PRESSE for real time MWIR large scale scene rendering involved with our new BRDF model for various landscapes, we will continue to explore the applications of the dynamic infrared images simulation. We will also improve the render performance in order to be suitable for some hardware in the loop requirements (200 HZ). 
This will benefit instrument testing and manufacturing.

Conflict of interest

No conflict of interest exists in the submission of this manuscript.

Acknowledgements

This work was supported by the Natural Science Foundation of China under Grant No. 61301290. The authors would like to thank Dr. Melin Huang and Zhiwei Ye of the Space Science and Engineering Center at the University of Wisconsin-Madison for a critical and careful reading of the manuscript, and the anonymous reviewers for their constructive comments.