{"id":531,"date":"2017-09-06T21:44:21","date_gmt":"2017-09-06T21:44:21","guid":{"rendered":"https:\/\/beta.research.ece.ncsu.edu\/osl\/?page_id=531"},"modified":"2018-10-16T18:58:27","modified_gmt":"2018-10-16T18:58:27","slug":"imaging-polarimeter","status":"publish","type":"page","link":"https:\/\/research.ece.ncsu.edu\/osl\/imaging-polarimeter\/","title":{"rendered":"Imaging Polarimeter"},"content":{"rendered":"<h2>1.0\u00a0\u00a0\u00a0 Introduction<\/h2>\n<p>The goal of this instrumentation tutorial is to create and calibrate a Stokes imaging polarimeter. This requires a linear polarizer (polymer film will work) and a camera. Using a webcamera, cell phone, a point and shoot, or digital SLR camera, you will be able to collect image data. These data can be collected for any given scene using different polarizer orientations. You will then learn how to reduce the data and calculate calibrated (or approximately calibrated) Stokes parameters. I say \u201capproximately\u201d because most of these data will be collected using hand-held polarizers, and you can only position them so accurately!<\/p>\n<h2>2.0\u00a0\u00a0\u00a0 Preliminary Setup<\/h2>\n<p>You will need to download two computer applications for this project. The first one is a small (free) application called \u201cIris\u201d. It can be downloaded here: <a href=\"http:\/\/www.astrosurf.com\/buil\/us\/iris\/zip\/iris.zip\">http:\/\/www.astrosurf.com\/buil\/us\/iris\/zip\/iris.zip<\/a>. The documentation is located here: <a href=\"http:\/\/www.astrosurf.com\/buil\/iris\/nav_pane\/CommandsFrame.html\">http:\/\/www.astrosurf.com\/buil\/iris\/nav_pane\/CommandsFrame.html<\/a>. This is a very powerful image processing software, mainly geared for (and created by) amateur astronomers \u2013 but it works great for other purposes too, and can be also used to process raw data from digital SLR cameras. You can also download and install Matlab or gain access to a computer that has it. 
Note that the required Matlab processing is fairly minimal, as much of it will be done in Iris (<em>Note: for the purposes of the tutorial, we can use IRIS exclusively since Matlab is not free<\/em>). Alternatively, <a href=\"https:\/\/www.gnu.org\/software\/octave\/download.html\">Octave<\/a> is like Matlab and is open-source. Much of the syntax is similar, so any discussion about Matlab can be applied to Octave as well.<\/p>\n<p>&nbsp;<\/p>\n<h2>4.0\u00a0\u00a0\u00a0 Configuration 1 \u2013 Linear Polarimeter<\/h2>\n<p>The first configuration that will be investigated is that of a simple linear polarimeter. In this case, it will consist of a rotating linear polarizer in front of a camera. The general configuration is shown below:<\/p>\n<p style=\"text-align: center;\"><a href=\"https:\/\/research.ece.ncsu.edu\/osl\/imaging-polarimeter\/im1\/\" rel=\"attachment wp-att-532\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-532\" src=\"https:\/\/research.ece.ncsu.edu\/wp-content\/uploads\/sites\/27\/2017\/09\/im1.jpg\" alt=\"\" width=\"647\" height=\"256\" srcset=\"https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/im1.jpg 647w, https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/im1-300x119.jpg 300w\" sizes=\"auto, (max-width: 647px) 100vw, 647px\" \/><\/a><\/p>\n<p>For this polarimeter configuration, it is recommended that you perform the following steps:<\/p>\n<h3>4.1\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Collect Practice Data and Familiarize Yourself with Data Reduction Procedure<\/h3>\n<ol>\n<li><strong>Acquire 3 images of an indoor scene<\/strong>. This will allow you to collect some data so that you can practice the procedure. Ideally, hold the camera as still as possible to make image registration easier. 
One image should be taken through the rotating polarizer at 0 degrees, one image taken through the polarizer at approximately 45 degrees, and the final image should be acquired through the linear polarizer at 90 degrees. You can do this by holding the polarizer by hand in front of your chosen camera. Some tips are as follows:\n<ol>\n<li>DIGITAL SLR: If you are using a digital SLR, put the camera on manual exposure (so fix the exposure speed AND focal ratio of the lens to something with acceptable dynamic range) and fix the white balance to daylight (actually, it does not matter what \u2013 just so long as it is fixed). For now, simply acquire data as JPG.<\/li>\n<li>SMARTPHONE: If you are using a smartphone, download a (preferably free) camera application that allows you to at least view the exposure speed, if not set it manually. The key is that you do not want the camera to automatically adjust the image brightness (gain) or to change the exposure speed during the course of your measurements. If you are having trouble with this, you may need to image the scene with some kind of diffuse target in it (like a piece of paper) that can be used to normalize the intensity of all the images. Ideally this target would also not produce a strong polarization signature. If you have an android phone, the app \u201cCamera FV-5 Lite\u201d works well: <a href=\"https:\/\/play.google.com\/store\/apps\/details?id=com.flavionet.android.camera.lite\">https:\/\/play.google.com\/store\/apps\/details?id=com.flavionet.android.camera.lite<\/a>. The free version restricts resolutions to 640&#215;480 pixels, but this is sufficient for our purposes.<\/li>\n<li>POINT AND SHOOT: For point and shoot cameras, the control over the exposure, focal ratio, white balance, etc. is highly varied. Try to place the camera in \u201cas manual\u201d a mode as possible. For instance, many cameras have aperture priority or shutter priority modes \u2013 these can be better than full automatic. 
Additionally, most have a way to manually set the white balance, so you will want to enable this option if available. Finally, some cameras have \u201cpicture modes\u201d (somewhat obscure modes on the cameras, like \u201cnight shot\u201d, etc.) that can enable manual features. Please see me if you are having difficulty maintaining the exposure and\/or focal ratio from shot to shot.<\/li>\n<li>WEBCAM: For most web camera applications, you have control over the exposure, gain, white balance, etc., so you should be able to hold the exposure fixed. But again, this can vary from camera to camera.<\/li>\n<\/ol>\n<\/li>\n<\/ol>\n<p>Of these options, the digital SLR is superior, followed by the webcam and\/or point-and-shoot, with the smartphone coming in last. For best results, you want as much control as possible over the exposure, white balance, exposure bracketing, etc.<\/p>\n<p>Below is an example of 3 images that I took with the FV-5 Lite camera application on my Android cellphone. This is a 3-exposure series of my office desk (super exciting!).<\/p>\n<p>&nbsp;<\/p>\n<p><a href=\"https:\/\/research.ece.ncsu.edu\/osl\/imaging-polarimeter\/image2\/\" rel=\"attachment wp-att-533\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-533\" src=\"https:\/\/research.ece.ncsu.edu\/wp-content\/uploads\/sites\/27\/2017\/09\/Image2.jpg\" alt=\"\" width=\"947\" height=\"279\" srcset=\"https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image2.jpg 947w, https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image2-300x88.jpg 300w, https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image2-768x226.jpg 768w, https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image2-800x236.jpg 800w\" sizes=\"auto, (max-width: 947px) 100vw, 947px\" \/><\/a><\/p>\n<p>Note the reflection of the wall on the desk \u2013 its intensity decreases as the linear
polarizer is rotated from 0 degrees (the reflected s polarized light from the desk transmits) to 90 degrees (the reflected s polarized light from the desk gets blocked). The 45 degree position is somewhere in between.<\/p>\n<ol start=\"2\">\n<li><strong>Transfer the files to your computer into a dedicated directory<\/strong> (e.g., C:\\tempimages\\linearfiles\\). Within this directory, create a temporary folder (e.g., C:\\tempimages\\linearfiles\\tmp\\). This folder is where Iris will store its temporary and intermediate image files.<br \/>\n<a href=\"https:\/\/research.ece.ncsu.edu\/osl\/imaging-polarimeter\/image3\/\" rel=\"attachment wp-att-534\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-534\" src=\"https:\/\/research.ece.ncsu.edu\/wp-content\/uploads\/sites\/27\/2017\/09\/Image3.jpg\" alt=\"\" width=\"351\" height=\"312\" srcset=\"https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image3.jpg 351w, https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image3-300x267.jpg 300w, https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image3-350x312.jpg 350w\" sizes=\"auto, (max-width: 351px) 100vw, 351px\" \/><\/a><\/li>\n<li><strong>Open Iris<\/strong>. Within Iris, select \u201cFile -&gt; Settings\u201d and select \u201cWorking Path\u201d. Copy and paste the path to the temporary directory to this location (e.g., C:\\tempimages\\linearfiles\\tmp\\). Also select FTS under \u201cFile Type\u201d. FTS or FITS files are used commonly by NASA and can be loaded easily into Matlab. Click OK to close the dialog box.<\/li>\n<li><strong>Now open the first image that you saved<\/strong> (say, your LP0 image). This is straightforward and can be done from the menu bar \u2013 when opening, be sure to select \u201cFiles of type:\u201d and select \u201cgraphics\u201d.<\/li>\n<li><strong>Open the command line<\/strong>. 
This is the small text looking button on the menu bar:<br \/>\n<a href=\"https:\/\/research.ece.ncsu.edu\/osl\/imaging-polarimeter\/image4\/\" rel=\"attachment wp-att-535\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-535\" src=\"https:\/\/research.ece.ncsu.edu\/wp-content\/uploads\/sites\/27\/2017\/09\/Image4.jpg\" alt=\"\" width=\"423\" height=\"73\" srcset=\"https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image4.jpg 423w, https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image4-300x52.jpg 300w\" sizes=\"auto, (max-width: 423px) 100vw, 423px\" \/><\/a><\/li>\n<\/ol>\n<p>This will open the following dialog:<br \/>\n<a href=\"https:\/\/research.ece.ncsu.edu\/osl\/imaging-polarimeter\/image5\/\" rel=\"attachment wp-att-536\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-536\" src=\"https:\/\/research.ece.ncsu.edu\/wp-content\/uploads\/sites\/27\/2017\/09\/Image5.jpg\" alt=\"\" width=\"387\" height=\"164\" srcset=\"https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image5.jpg 387w, https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image5-300x127.jpg 300w\" sizes=\"auto, (max-width: 387px) 100vw, 387px\" \/><\/a><\/p>\n<ol start=\"6\">\n<li><strong>Now you will save the images as FTS files<\/strong>. These will be saved in the tmp folder for future use. Simply type \u201csave im1\u201d to save the first image as an FTS.<\/li>\n<li><strong>Repeat steps 4 and 6 on the other two images (LP45 and LP90)<\/strong>. Save LP45 as <em>im2<\/em> and LP90 as <em>im3<\/em>.<\/li>\n<li><strong>Now you will convert the images to grayscale by summing the RGB values for each pixel<\/strong> (this is the inherent way that Iris creates the conversion). We will then divide the image by 3 so that we retain the previous dynamic range (0 to 255 in most cases). 
To convert to grayscale, first open im1 by typing<br \/>\n<strong><em>load im1<br \/>\n<a href=\"https:\/\/research.ece.ncsu.edu\/osl\/imaging-polarimeter\/image6\/\" rel=\"attachment wp-att-537\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-537\" src=\"https:\/\/research.ece.ncsu.edu\/wp-content\/uploads\/sites\/27\/2017\/09\/Image6.jpg\" alt=\"\" width=\"496\" height=\"602\" srcset=\"https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image6.jpg 496w, https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image6-247x300.jpg 247w\" sizes=\"auto, (max-width: 496px) 100vw, 496px\" \/><\/a><br \/>\n<\/em><\/strong><\/li>\n<\/ol>\n<p>in the command line. Then, convert the image to grayscale by typing:<\/p>\n<p><em>col2bw [INPUT] [OUTPUT] [NUMBER]<\/em><\/p>\n<p>In this case, the [input] is the input image\u2019s PREFIX (in this case, <em>im<\/em>), the [output] is the output image\u2019s PREFIX (something other than <em>im<\/em>, say, <em>imo<\/em>), and the [number] is the number of images that you want to convert within the series (in this case, 3). So we would type:<\/p>\n<p><strong><em>col2bw im imo 3<\/em><\/strong><\/p>\n<p>Iris will perform the batch conversion, yielding the oversaturated grayscale image below. Note that this is only oversaturated because in IRIS the default maximum value is 255. 
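The sum-then-rescale arithmetic just described (add the R, G, and B channels, then divide by 3 to restore the 0 to 255 range) can be sketched in Python\/numpy. This is a minimal illustration of the arithmetic, not Iris's actual implementation:

```python
import numpy as np

def col2bw_like(rgb):
    # Mimic the described conversion: sum R, G, and B for each pixel
    # (values can reach 255 * 3), then divide by 3 so the result fits
    # back into the original 0 to 255 dynamic range.
    summed = rgb.astype(np.float64).sum(axis=2)
    return summed / 3.0

white = np.full((4, 4, 3), 255, dtype=np.uint8)  # tiny all-white test image
gray = col2bw_like(white)
print(gray.max())  # 255.0 - a saturated pixel maps back to full scale
```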
You can select \u201cAuto\u201d on the \u201cthreshold\u201d toolbar and it will rescale the image\u2019s dynamic range (remember, the RGB values were ADDED together, so the maximum value in the image can now be 255*3).<\/p>\n<p><a href=\"https:\/\/research.ece.ncsu.edu\/osl\/imaging-polarimeter\/image7\/\" rel=\"attachment wp-att-538\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-538\" src=\"https:\/\/research.ece.ncsu.edu\/wp-content\/uploads\/sites\/27\/2017\/09\/Image7.jpg\" alt=\"\" width=\"455\" height=\"532\" srcset=\"https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image7.jpg 455w, https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image7-257x300.jpg 257w, https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image7-350x410.jpg 350w\" sizes=\"auto, (max-width: 455px) 100vw, 455px\" \/><\/a><\/p>\n<ol start=\"9\">\n<li>Now divide the images by 3. First, the images must be opened. Simply type<\/li>\n<\/ol>\n<p><strong><em>load imo1<\/em><\/strong><\/p>\n<p>To load the first image into the workspace. Then click on \u201cProcessing\u201d and select \u201cDivide\u201d and under \u201cValue\u201d enter 3. This will produce an image that is bounded by 0 to 255, as was the case in the original image. Once this is completed, type in the command line:<\/p>\n<p><strong><em>save imo1<\/em><\/strong><\/p>\n<p>Which will simply replace the active image with the previous image on the hard drive.<\/p>\n<ol start=\"10\">\n<li>Perform the previous step for images <em>imo2<\/em> and <em>imo3<\/em> such that all 3 images have been divided by 3.<\/li>\n<li>Finally, you need to take your 3 images and perform spatial image registration. This is best done to better than 1\/10<sup>th<\/sup> of a pixel (which is actually quite hard to do). Iris can do a pretty good job of registering images, which is the primary reason why we are using it. 
In the command line, type<\/li>\n<\/ol>\n<p><em>load imo1<\/em><\/p>\n<p>to load the first image into the workspace. Now select an area in the middle of your image (it can also be slightly decentered). The key is that it must have spatial information within it \u2013 areas with edges, etc. are good; in my image, the featureless area of the desk would be a poor choice. An example is shown below:<\/p>\n<p><a href=\"https:\/\/research.ece.ncsu.edu\/osl\/imaging-polarimeter\/image8\/\" rel=\"attachment wp-att-539\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-539\" src=\"https:\/\/research.ece.ncsu.edu\/wp-content\/uploads\/sites\/27\/2017\/09\/Image8.jpg\" alt=\"\" width=\"560\" height=\"630\" srcset=\"https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image8.jpg 604w, https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image8-267x300.jpg 267w\" sizes=\"auto, (max-width: 560px) 100vw, 560px\" \/><\/a><\/p>\n<p>Now select \u201cProcessing\u201d and choose \u201cPlanetary Registration (1)\u201d. This will open the following dialog:<\/p>\n<p><a href=\"https:\/\/research.ece.ncsu.edu\/osl\/imaging-polarimeter\/image9\/\" rel=\"attachment wp-att-540\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-540\" src=\"https:\/\/research.ece.ncsu.edu\/wp-content\/uploads\/sites\/27\/2017\/09\/Image9.jpg\" alt=\"\" width=\"550\" height=\"307\" srcset=\"https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image9.jpg 550w, https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image9-300x167.jpg 300w\" sizes=\"auto, (max-width: 550px) 100vw, 550px\" \/><\/a><\/p>\n<p>In this box, type \u201cInput Generic name\u201d as <em>imo<\/em> (again, this is the INPUT prefix), leave the sub-image size as 256 pixels, and under \u201cOutput generic name\u201d type <em>imr<\/em>. 
Finally, for number, select 3 since we have 3 images (e.g., imo1, imo2, and imo3). If we had 10 images, we would simply type 10 here \u2013 so this is just dependent on how many images you have in your current sequence. Clicking OK will register the images and save the spatially registered output images in imr1, imr2, and imr3.<\/p>\n<ol start=\"12\">\n<li><strong>Congrats! You have navigated your way through IRIS<\/strong>. The program is very powerful for processing images. It also has good noise-filtering functionality, and it handles RAW images from digital SLR cameras very well!<\/li>\n<li><strong>MATLAB for Stokes parameter calculation (Optional &#8211; Note you can also do this in Octave)<\/strong>: Now you can transition into using Matlab. You could technically continue to use IRIS for the rest of this, but it may be more cumbersome than it needs to be. In any case, the Matlab script is straightforward and mainly requires that you input the correct measurement matrix (<strong>W<\/strong>). The Matlab code that I have prepared is provided in Appendix A. The code is generally set up as follows:\n<ol>\n<li>Load the 3 images into an array \u201cSamples\u201d that is Nx x Ny x Ns, where Nx and Ny are the number of pixels in x and y for your particular camera and Ns is the number of images (in this case, 3).<\/li>\n<li>Create the W matrix. In this case, a function \u201clpr\u201d is used, which is a custom function of mine. 
This simply takes three inputs \u2013 the orientation of the diattenuator\u2019s transmission axis in degrees, the transmission of the x axis, and the transmission of the y axis \u2013 and provides you with the Mueller matrix of the diattenuator.<\/li>\n<li>Apply the W matrix inversion technique to EACH PIXEL of the sample data.<\/li>\n<li>Once completed, the Stokes vector is stored in a matrix S, which is Nx x Ny x 4; the S3 component is always 0 (this is a linear polarimeter).<\/li>\n<li>The DOLP is then calculated, along with the normalized Stokes parameters. These can then be displayed with Matlab.<\/li>\n<\/ol>\n<\/li>\n<li><strong>IRIS for Stokes parameter calculation<\/strong>: If you really want to use IRIS to calculate the Stokes parameter images, you can. However, you will need to determine the pseudoinverse of the <strong>W<\/strong> matrix before you do so. In this case, W is:<\/li>\n<\/ol>\n<p>W =<\/p>\n<p>0.5050\u00a0\u00a0\u00a0 0.4950\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 0<\/p>\n<p>0.5050\u00a0\u00a0\u00a0 0.0000\u00a0\u00a0\u00a0 0.4950\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 0<\/p>\n<p>0.5050\u00a0\u00a0 -0.4950\u00a0\u00a0\u00a0 0.0000\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 0<\/p>\n<p>&nbsp;<\/p>\n<p>And the pseudoinverse PW is:<\/p>\n<p>PW =<\/p>\n<p>0.9901\u00a0\u00a0\u00a0 0.0000\u00a0\u00a0\u00a0 0.9901<\/p>\n<p>1.0101\u00a0\u00a0\u00a0 0.0000\u00a0\u00a0 -1.0101<\/p>\n<p>-1.0101\u00a0\u00a0\u00a0 2.0202\u00a0\u00a0 -1.0101<\/p>\n<p>0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 0<\/p>\n<p>From PW, we can apply the coefficients in IRIS using the multiply command. The first row of PW calculates S0, the second row calculates S1, the third row S2, and the fourth row S3. The coefficients in each row multiply imr1, imr2, and imr3, respectively. 
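As a sanity check, the W and PW values above can be reproduced numerically. The sketch below assumes a helper (here called lp_row) standing in for the custom lpr function from Appendix A; it builds each measurement row from the top row of the Mueller matrix of a linear diattenuator:

```python
import numpy as np

def lp_row(theta_deg, tx, ty):
    # Top row of the Mueller matrix of a linear diattenuator whose
    # transmission axis sits at theta_deg, with axis transmittances tx, ty.
    # (Assumed stand-in for the tutorial's custom lpr function.)
    t = np.radians(2.0 * theta_deg)
    return 0.5 * np.array([tx + ty, (tx - ty) * np.cos(t),
                           (tx - ty) * np.sin(t), 0.0])

# One measurement row per polarizer orientation (0, 45, 90 degrees),
# assuming an extinction ratio of 100 (ty = 0.01) as in Appendix A.
W = np.vstack([lp_row(a, 1.0, 0.01) for a in (0, 45, 90)])
PW = np.linalg.pinv(W)  # 4x3 pseudoinverse; entries match those shown above

# Applying PW at every pixel of the registered stack (Nx x Ny x 3)
# yields the Stokes images (Nx x Ny x 4):
stack = np.ones((2, 2, 3))  # placeholder for the imr1..imr3 data
S = np.einsum('km,ijm->ijk', PW, stack)
```

Note that the fourth row of PW is identically zero, which is why the reconstructed S3 image is always zero for this linear polarimeter.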
Thus, to calculate S0, we can perform the following in the command window below:<\/p>\n<p><a href=\"https:\/\/research.ece.ncsu.edu\/osl\/imaging-polarimeter\/image10\/\" rel=\"attachment wp-att-541\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-541\" src=\"https:\/\/research.ece.ncsu.edu\/wp-content\/uploads\/sites\/27\/2017\/09\/Image10.jpg\" alt=\"\" width=\"525\" height=\"724\" srcset=\"https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image10.jpg 572w, https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image10-217x300.jpg 217w\" sizes=\"auto, (max-width: 525px) 100vw, 525px\" \/><\/a><\/p>\n<p>In this case, im1 is loaded, multiplied by 0.9901, and saved as pr01. Then im3 is loaded, multiplied by 0.9901, and saved as pr03. Finally, pr01 is loaded, added to pr03, and saved as S0.<\/p>\n<p>Similarly, S1 can be calculated:<\/p>\n<p><a href=\"https:\/\/research.ece.ncsu.edu\/osl\/imaging-polarimeter\/image11\/\" rel=\"attachment wp-att-542\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-542\" src=\"https:\/\/research.ece.ncsu.edu\/wp-content\/uploads\/sites\/27\/2017\/09\/Image11.jpg\" alt=\"\" width=\"529\" height=\"773\" srcset=\"https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image11.jpg 529w, https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image11-205x300.jpg 205w\" sizes=\"auto, (max-width: 529px) 100vw, 529px\" \/><\/a><\/p>\n<p>Note that S1 is very dominant on the horizontal desk. 
This is because most of this light is polarized horizontally (s polarized) in this case.<\/p>\n<p>Finally, we can process S2 (note that ps01 and ps03 were calculated in the previous step for S1):<\/p>\n<p><a href=\"https:\/\/research.ece.ncsu.edu\/osl\/imaging-polarimeter\/image12\/\" rel=\"attachment wp-att-543\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-543\" src=\"https:\/\/research.ece.ncsu.edu\/wp-content\/uploads\/sites\/27\/2017\/09\/Image12.jpg\" alt=\"\" width=\"530\" height=\"729\" srcset=\"https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image12.jpg 530w, https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/Image12-218x300.jpg 218w\" sizes=\"auto, (max-width: 530px) 100vw, 530px\" \/><\/a><\/p>\n<p>You can also do division to calculate the normalized Stokes parameters, but this gets challenging in Iris as opposed to Matlab, primarily due to division by 0 producing very large (infinite) pixel values.<\/p>\n<h2>4.2\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Collect Indoor Validation Data<\/h2>\n<p>Perform the same procedure as in Section 4.1 on a \u201cvalidation scene\u201d. In this case, the simplest validation is to check whether your degree of linear polarization is approximately equal to one for a fully polarized source. The easiest scene with a DOLP of 1 is your laptop, TV, or computer screen. This contains a linear polarizer, which is usually oriented (approximately) at 45 degrees. For these images, be careful to make sure that the laptop screen is relatively far away so that you can perform image registration on some other part of the scene (say, something behind the laptop screen). If you try to use the laptop screen itself (or the information on it) to perform the image registration step, it won\u2019t work on some frames because the laptop screen may be completely dark! 
Process the data and verify that the DOLP is approximately equal to 1.<\/p>\n<h2>4.3\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Collect Outdoor Data<\/h2>\n<p>Now you can go outside and collect some images. Interesting objects include cars and parking lots; pavement and water also have interesting polarization signatures. Be careful about trees, though. One issue with a temporally scanned polarimeter such as this is wind-blown foliage: it can cause problems with image registration (if you try to use a tree as your registration target), or it can create false polarization signatures (the trees will look more polarized than they should be!).<\/p>\n<h2>4.4\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Color Fusion<\/h2>\n<p>While we won&#8217;t cover it here yet, these results can be depicted in a red-green-blue (RGB) colorspace. In a colorfused image, the grayscale intensity (S0) image is falsely colorized using the polarization information. Specifically, the degree of linear polarization (DOLP = sqrt(S1^2+S2^2)\/S0) is used to establish the saturation, or amount of color, for a given pixel. Meanwhile, the orientation of the linear polarization state is used to establish the hue, or color, of a given pixel. 
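The hue-saturation-value mapping just described can be sketched in Python using numpy and the standard colorsys module. This is an illustration of the scheme, not the actual code used to produce the colorfused images:

```python
import colorsys
import numpy as np

def colorfuse(S0, S1, S2):
    # DOLP sets the saturation, the angle of linear polarization sets
    # the hue, and the intensity S0 sets the value (brightness).
    dolp = np.sqrt(S1**2 + S2**2) / np.maximum(S0, 1e-9)
    aolp = 0.5 * np.arctan2(S2, S1)     # radians, in [-pi/2, pi/2]
    hue = (aolp + np.pi / 2.0) / np.pi  # mapped onto [0, 1] for HSV
    value = S0 / S0.max()
    rgb = np.zeros(S0.shape + (3,))
    for i in range(S0.shape[0]):
        for j in range(S0.shape[1]):
            rgb[i, j] = colorsys.hsv_to_rgb(hue[i, j],
                                            float(np.clip(dolp[i, j], 0, 1)),
                                            value[i, j])
    return rgb
```

An unpolarized pixel (S1 = S2 = 0) has zero saturation and stays gray, while a fully polarized pixel receives a fully saturated color whose hue encodes its polarization angle.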
A view of an image from the Hunt Library at NCSU, taken by Brett Pantalone, illustrates this effect.<\/p>\n<p><a href=\"https:\/\/research.ece.ncsu.edu\/osl\/imaging-polarimeter\/huntlibrary\/\" rel=\"attachment wp-att-544\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-544\" src=\"https:\/\/research.ece.ncsu.edu\/wp-content\/uploads\/sites\/27\/2017\/09\/HuntLIbrary.jpg\" alt=\"\" width=\"640\" height=\"360\" srcset=\"https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/HuntLIbrary.jpg 640w, https:\/\/research.ece.ncsu.edu\/osl\/wp-content\/uploads\/sites\/27\/2017\/09\/HuntLIbrary-300x169.jpg 300w\" sizes=\"auto, (max-width: 640px) 100vw, 640px\" \/><\/a><\/p>\n<p>Note that the sky in the background is polarized, as indicated by the presence of highly saturated color. This polarization is due to Rayleigh scattering in the atmosphere, and is what many insects use for navigation. Meanwhile, the window surfaces on the building (smooth dielectric surfaces) are also highly polarized, but have a different color (green). This indicates that they have a similar polarization strength (similar degree of linear polarization) to the sky; however, they are polarized at a different angle. In this case, green is roughly vertically polarized while red is roughly polarized at 135 degrees. Conversely, rough surfaces in the scene, such as the pavement and grass, are generally unpolarized or weakly polarized, as indicated by the relative lack of color. Finally, you can also see colorful &#8220;shadows&#8221; of people. This is actually caused by the temporally scanned nature of the polarimeter. 
Since it takes time to rotate the polarizer into new orientations, movement in the scene (as is generated by people walking past the camera) shows up as a false polarization signature!<\/p>\n<h2>5.0\u00a0\u00a0\u00a0 Appendix A &#8211; Matlab \/ Octave Reduction Matrix Code<\/h2>\n<p>clear all<\/p>\n<p>close all<\/p>\n<p>&nbsp;<\/p>\n<p>%load images<\/p>\n<p>tmp = fitsread(&#8216;I:\\0.Work\\0.Classes\\ECE592-035 &#8211; Polarimetry\\Design Project\\Matlab Code\\Images\\tmp\\s2.fts&#8217;);<\/p>\n<p>Samples(:,:,1) = tmp;<\/p>\n<p>&nbsp;<\/p>\n<p>tmp = fitsread(&#8216;I:\\0.Work\\0.Classes\\ECE592-035 &#8211; Polarimetry\\Design Project\\Matlab Code\\Images\\tmp\\s1.fts&#8217;);<\/p>\n<p>Samples(:,:,2) = tmp;<\/p>\n<p>&nbsp;<\/p>\n<p>tmp = fitsread(&#8216;I:\\0.Work\\0.Classes\\ECE592-035 &#8211; Polarimetry\\Design Project\\Matlab Code\\Images\\tmp\\s3.fts&#8217;);<\/p>\n<p>Samples(:,:,3) = tmp;<\/p>\n<p>&nbsp;<\/p>\n<p>Nx = length(Samples(:,1,1));<\/p>\n<p>Ny = length(Samples(1,:,1));<\/p>\n<p>Ns = length(Samples(1,1,:));<\/p>\n<p>&nbsp;<\/p>\n<p>%%<\/p>\n<p>%now calculate the Stokes parameters assuming that the linear polarizer has<\/p>\n<p>%an extinction ratio of 100 or so.<\/p>\n<p>M1 = lpr(0,1,0.01);<\/p>\n<p>M2 = lpr(45,1,0.01);<\/p>\n<p>M3 = lpr(90,1,0.01);<\/p>\n<p>&nbsp;<\/p>\n<p>W(1,:) = M1(1,:);<\/p>\n<p>W(2,:) = M2(1,:);<\/p>\n<p>W(3,:) = M3(1,:);<\/p>\n<p>&nbsp;<\/p>\n<p>PW = pinv(W);<\/p>\n<p>&nbsp;<\/p>\n<p>%reconstruct the Stokes parameters pixel by pixel, using the registered<\/p>\n<p>%images<\/p>\n<p>Pout = zeros(Nx,Ny,Ns);<\/p>\n<p>Pout(:,:,1) = Samples(:,:,1); %this was the master image from before<\/p>\n<p>for n=1:Nx<\/p>\n<p>n<\/p>\n<p>for m=1:Ny<\/p>\n<p>%create power vector<\/p>\n<p>P = squeeze(Samples(n,m,:));<\/p>\n<p>S(n,m,:) = PW*P;<\/p>\n<p>end<\/p>\n<p>end<\/p>\n<p>%%<\/p>\n<p>S0 = flipud(squeeze(S(:,:,1)));<\/p>\n<p>S1 = flipud(squeeze(S(:,:,2)));<\/p>\n<p>S2 = flipud(squeeze(S(:,:,3)));<\/p>\n<p>&nbsp;<\/p>\n<p>S1norm = 
S1.\/S0;<\/p>\n<p>S2norm = S2.\/S0;<\/p>\n<p>DOLP = sqrt(S1norm.^2+S2norm.^2);<\/p>\n<p>&nbsp;<\/p>\n<p>figure(1)<\/p>\n<p>imagesc(S1norm)<\/p>\n<p>colormap(gray(256))<\/p>\n<p>caxis([-1 1])<\/p>\n<p>figure(2)<\/p>\n<p>imagesc(S2norm)<\/p>\n<p>colormap(gray(256))<\/p>\n<p>caxis([-1 1])<\/p>\n<p>figure(3)<\/p>\n<p>imagesc(DOLP)<\/p>\n<p>colormap(gray(256))<\/p>\n<p>caxis([0 1])<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n<p><a href=\"#_ftnref1\" name=\"_ftn1\">[1]<\/a> Care must be taken when using this polarimeter configuration, though, because the polarization state that is incident on the camera is always changing \u2013 this can result in additional measurement error!<\/p>\n<p><a href=\"#_ftnref2\" name=\"_ftn2\">[2]<\/a> For truly accurate data with a digital SLR, you can also acquire images in RAW mode \u2013 but processing steps will differ significantly if you do this. So please for now, use JPG.<\/p>\n<p>This material is based upon work supported by the National Science Foundation under Grant Number (NSF Grant Number 1407885). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>1.0\u00a0\u00a0\u00a0 Introduction The goal of this instrumentation tutorial is to create and calibrate a Stokes imaging polarimeter. 
This requires a linear polarizer (polymer film will&#8230;<\/p>\n","protected":false},"author":103,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_acf_changed":false,"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"class_list":["post-531","page","type-page","status-publish","hentry"],"acf":[],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/research.ece.ncsu.edu\/osl\/wp-json\/wp\/v2\/pages\/531","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/research.ece.ncsu.edu\/osl\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/research.ece.ncsu.edu\/osl\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/research.ece.ncsu.edu\/osl\/wp-json\/wp\/v2\/users\/103"}],"replies":[{"embeddable":true,"href":"https:\/\/research.ece.ncsu.edu\/osl\/wp-json\/wp\/v2\/comments?post=531"}],"version-history":[{"count":5,"href":"https:\/\/research.ece.ncsu.edu\/osl\/wp-json\/wp\/v2\/pages\/531\/revisions"}],"predecessor-version":[{"id":568,"href":"https:\/\/research.ece.ncsu.edu\/osl\/wp-json\/wp\/v2\/pages\/531\/revisions\/568"}],"wp:attachment":[{"href":"https:\/\/research.ece.ncsu.edu\/osl\/wp-json\/wp\/v2\/media?parent=531"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}