Sunday, December 7, 2014

Project 4 - Report Week
Final Presentation - South Plainfield, NJ Food Deserts
GIS 4930

For my final project in GIS 4930, I finished the class by wrapping up my project as a whole. This week I was asked to create a PowerPoint presentation displaying my analysis of food deserts in South Plainfield, New Jersey. I used the same concepts and principles from previous weeks to produce not only a static map within QGIS, but also a webmap built with TileMill, Mapbox, and Leaflet.

In addition, I became more familiar with how to add plugins to my Leaflet code.  This took more time than I anticipated, as I had to do quite a bit of research online and work with colleagues to produce the results.  In the end, I was able to add a full-screen plugin along with a Mini Map interface on my webmap.  These new plugins, along with the earlier Geo Locator tool plugin, let users actually interact with the map rather than just view it.

As you will see in my results, I found that a large area of South Plainfield, New Jersey is located within a food desert.  This is primarily due to the lack of a quality grocery store on the western side of town.  If you look at the map below, you can see the area has been identified using a color ramp to highlight the population density affected by this food desert phenomenon.


Please feel free to take a look at my Presentation below.  In it I go into greater detail about the methods used to identify these food deserts as well as highlight what I believe can be done to minimize their existence in the local area.



My webmap can also be accessed at:



Friday, November 21, 2014

Project 4 - Analysis 2
TileMill and Mapbox
GIS 4930



This week we took what we learned previously in TileMill and integrated it into Mapbox.  WOW! What a great little program that is completely free!  It provides up to 100 MB of storage for layers and gives anyone with the know-how the ability to produce visually stunning digital maps.

For this map, I was just getting comfortable with the tools; the real work begins with Report week.  Here I simply wanted to show the data I produced and put it on the map.  I see that the transparency does not come through, and this does not look good against the basemap I chose.  I will no doubt need to work on this for my final report.

My grocery store data (seen as small red dots on the map) was collected from local knowledge of grocery stores in the area, as well as the help of Google Earth to jog my memory.  Using Google Earth I was able to identify and isolate my grocery stores into a single group.  With just those locations selected, I exported the points as a KML file.  I brought this file into ArcGIS, where I converted the KML into a shapefile and used the Project tool to put it into a New Jersey State Plane projection.  With the ArcGIS processing done, I then used QGIS to review all the data.
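For anyone who would rather script that conversion than click through the tools, here is a minimal arcpy sketch of the KML-to-shapefile-to-State-Plane chain; the file names and paths are placeholders, not the ones I actually used.

```python
import arcpy

# Convert the Google Earth export to a file geodatabase; KMLToLayer
# writes a <name>.gdb with the points in a Placemarks feature dataset.
arcpy.KMLToLayer_conversion("grocery_stores.kml", r"C:\gis\food_deserts")

# Reproject to New Jersey State Plane (NAD83, US feet), EPSG 3424.
arcpy.Project_management(
    r"C:\gis\food_deserts\grocery_stores.gdb\Placemarks\Points",
    r"C:\gis\food_deserts\grocery_stores_njsp.shp",
    arcpy.SpatialReference(3424))
```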

My food desert locations were identified by first using the 2010 US Census block data I obtained from the New Jersey Geographic Information Network (NJGIN), https://njgin.state.nj.us.  This site is the New Jersey GIS data warehouse that the state's Office of GIS maintains and updates.  Using QGIS I isolated the blocks that fell within the municipal boundary (also obtained from NJGIN).  With the census blocks isolated, I used the polygon centroid tool within QGIS to locate the centroid of each block.  Then, using QGIS's spatial selection capability, I identified the blocks whose centroids were within 1 mile of a grocery store.  These blocks were exported as a Food Oasis layer.  I then reversed the selection and exported the remaining blocks, which were considered Food Deserts.
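The same centroid test can be scripted.  Below is a minimal arcpy sketch of the logic (I ran the real workflow interactively in QGIS, so the layer names here are assumed):

```python
import arcpy

# Centroids of the census blocks (the QGIS polygon-centroid step).
arcpy.FeatureToPoint_management("census_blocks.shp",
                                "block_centroids.shp", "INSIDE")

# Centroids within 1 mile of any grocery store = the oasis side.
arcpy.MakeFeatureLayer_management("block_centroids.shp", "cent_lyr")
arcpy.SelectLayerByLocation_management("cent_lyr", "WITHIN_A_DISTANCE",
                                       "grocery_stores_njsp.shp", "1 Miles")
arcpy.CopyFeatures_management("cent_lyr", "oasis_centroids.shp")

# Flip the selection for the food desert side; joining the selected
# centroids back to their parent blocks reproduces the block exports.
arcpy.SelectLayerByLocation_management("cent_lyr",
                                       selection_type="SWITCH_SELECTION")
arcpy.CopyFeatures_management("cent_lyr", "desert_centroids.shp")
```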

I believe the quality and credibility of the data are quite good.  The grocery stores I included were major chains, large enough to support several hundred individuals.  Smaller stores like 7-Eleven or gas stations were not cataloged, nor were very small stores that lacked the square footage to stock produce in volume.  The desert and oasis layers, being made up of census blocks, are only as good as the data provided.  Being familiar with the area, the population numbers look accurate, and coming from the US Census the data should be of good quality, even though 4 years have passed since it was last collected.


The data was not very surprising to me for my local area.  The west and south-central parts of town are heavily commercial and industrial.  The trend I observed was that grocery stores were more prevalent where the residential population was higher, on the north side of town.  This makes sense, since their customers are predominantly shopping for their homes and not their businesses.  What I did notice was that subdivisions located near or around the industrial parks fell in food desert areas.  In fact, some of the higher-population census blocks are located near these industrial parks in the central part of town.  The construction of a single grocery store in the central section of town could reduce the food desert by approximately 90%.  What does surprise me is that a local retail chain has not identified the market potential here and built a store to capture it.

Thursday, November 13, 2014

Project 4 - Analysis 1
Intro to TileMill and Leaflet
GIS4930

This week we jumped into learning TileMill and Leaflet, both free open-source software packages.  In TileMill we learned how to navigate the program, added layers, modified symbology, and brushed up on our ColorBrewer skills, all while using code to do so.  In Leaflet we learned how to reach out and captivate an audience through the power of web mapping.  By scraping some source data, we modified it and made it our own.  In Image 1 you can see the code used to identify a point and modify the text that appears in that point's popup window.  You can also see how a circle was drawn to identify a food oasis area, and the lat/long coordinates used to draw a polygon.

Image 1 - Code utilized to identify/draw a point, circle, and polygon

Image 2 shows how the coding from Image 1 was displayed on the web map.


Image 2 - Webmap of Pensacola, Florida depicting the identifiers built into the code.

Later in the same lab, I was able to add more layers to the webmap and make them interactive.  In Image 3 below you can see disc golf locations found around Pensacola, Florida.  The dialog box in the upper right shows which layers can be turned on and off with a click of the mouse.

Image 3 - Disc golf locations around Pensacola, Florida

This week's lab was a great lab!  I will definitely put what I learned to use in projects I have planned for the future.  What great tools, and for the right price too!

Sunday, November 9, 2014

Project 4 - QGIS
Prepare week
GIS4930


This week's lab had us begin to utilize an open-source GIS program called QGIS.  This program does not have all the bells and whistles of ArcGIS Desktop, but it gives the user the basic capabilities of ArcGIS for a price that can't be argued with: FREE.

To begin this lab I first familiarized myself with the inner workings and setup of the layout.  Quickly I discovered that it was similar, with a few minor changes.  Some of these changes were for the better and others just annoyed me.  However the overall application was easy to work with and allowed me to accomplish the task as assigned.

Clipping, Attribute Selection, Mapping Basics, and Analysis tools all were available or integrated into my finished maps below.  I even utilized the Dissolve tool to create a border for my study area in part B.

For the first map, I became familiar with basic map elements and learned to work the Print Composer like an artist would.
Escambia Florida - University of West Florida - QGIS developed map
The second map, shown below, had a basic geoprocessing analysis built into it.  Everything was done within QGIS, with the exception of the Near tool, which was run from ArcGIS.  The map shows food desert and food oasis locations.  Essentially, a food desert is described as an area that does not have access to a quality grocery store within a mile, while a food oasis is just the opposite.  As you can see, a large area of Pensacola, Florida falls within what could be defined as a food desert.


The lab this week was a great look at programs outside of ArcGIS that can provide quick and cheap map-making capabilities.  With a strong background in ArcGIS and geographic information science, a user can pick up these tools and still present the desired results to the reader.


Saturday, October 25, 2014

Meth Lab Analysis Week
Project 3 - OLS Analysis
GIS 4930

This week we delved into the world of statistical analysis in GIS, specifically ordinary least squares (OLS) and geographically weighted regression (GWR).  Below is the OLS table and the map showing the results of that OLS analysis.  I followed the lab and feel that my results fell within the directions given.  As this type of analysis is widely used, I would like to have a more in-depth understanding of it.  Statistical analysis is not a strong point of mine, and I felt I needed to understand it better.



The map itself turned out quite well and produced the results I theorized it would: meth labs would be more likely to appear inside the urban populated area of Charleston, WV.  Rather than show the standard residual and its decimal breakdown, I took the time to make the map more readable by describing the trends in the legend.  I felt that showing raw numeric values would limit the audience, as most people would not understand what standard residual values mean.  Since this map targets law enforcement and not my GIS peers, this only makes sense.  Also, I kept the inset on the Charleston area, as this is where most of the predicted increase in meth lab likelihood is focused.


Overall, this project was a bit difficult to understand and move forward on.  I would like to have more time to focus on understanding this type of statistical analysis in a more confined study with limited variables so I can see and understand the significance of the statistical changes.  Although advanced, I did understand the basic principles, but not to the level I feel I should. 

Wednesday, October 15, 2014

Project 3: Statistics - Prepare Week
Meth Labs per Square Mile - Charleston, WV
GIS 4930

For this week's lab we began Prepare week by getting together a basemap that will be used in next week's lab.  I identified the Charleston, West Virginia area as our study area for the project.  For this project we are looking into the socio-economic factors behind the distribution of methamphetamine labs.  By using census tract data from the U.S. Census Bureau and merging it with education data and lab locations, my hope is to identify areas that show a greater likelihood of containing an illicit meth laboratory.

Some of the steps to produce my final basemap included calculating multiple percentages.  I used the Field Calculator to compute percent population growth, percent white, and percent roommates.  Later in the lab we sped up the process by using a Python script.  This script quickly calculated multiple attribute fields, and I was tasked with modifying it to include percentages for age groups 40-49, 50-64, and above 64 years.  I also wrote a script to calculate the male-to-female ratio and the percentage of uneducated individuals in the population.
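As an illustration of what such a field-calculation script looks like, here is a hedged sketch using an update cursor; the field names are hypothetical and the lab's actual script differed.

```python
import arcpy

tracts = "census_tracts.shp"  # hypothetical layer and field names

# One percentage field as an example; the real script looped over a
# whole list of age-group fields.
arcpy.AddField_management(tracts, "PCT_40_49", "DOUBLE")

with arcpy.da.UpdateCursor(tracts,
                           ["POP_40_49", "TOT_POP", "PCT_40_49"]) as cur:
    for row in cur:
        # Guard against empty tracts to avoid dividing by zero.
        row[2] = 100.0 * row[0] / row[1] if row[1] else 0
        cur.updateRow(row)
```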

Using a spatial join, I was then able to get a total count of meth labs per census tract.  This in turn allowed me to calculate the number of labs per square mile for any specific tract.  I used this data to build a categorized symbology with a color ramp, showing where there were meth labs and how abundant they were in a given tract compared to others in the study area.  My final task was to clean up my attribute table, removing unneeded fields, and produce a basemap for this week's project.
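A sketch of that join-and-normalize step in arcpy: Join_Count is the field the Spatial Join tool really produces, while the area field SQMI and the file names are assumptions.

```python
import arcpy

# One output row per tract; the tool's Join_Count field holds the
# number of lab points falling in each tract.
arcpy.SpatialJoin_analysis("census_tracts.shp", "meth_labs.shp",
                           "tracts_labs.shp", "JOIN_ONE_TO_ONE", "KEEP_ALL")

# Normalize the count by tract area; SQMI (square miles) is an
# assumed field name.
arcpy.AddField_management("tracts_labs.shp", "LABS_SQMI", "DOUBLE")
arcpy.CalculateField_management("tracts_labs.shp", "LABS_SQMI",
                                "!Join_Count! / !SQMI!", "PYTHON_9.3")
```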


Thursday, October 9, 2014

Project 2 - Report Week
Mountain Top Removal Analysis
GIS 4930


Mountain Top Removal (MTR) was the focus of this project, as well as the environmental impact it has on the countryside.  Using satellite imagery from 2010, I was able to run a multi-band spectral analysis on our group study area (Group 3) and determine locations that may have ongoing MTR projects.  I also used Digital Elevation Models (DEMs) to run analysis and determine where streams may be located.  Stream locations are important, as they are greatly impacted by MTR projects: runoff into these streams can have detrimental impacts on the environment and surrounding wildlife.
Starting off the project, we identified what groups we would be in, and those groups were assigned specific study areas.  Using DEMs we not only created a stream layer but also identified watershed locations.  Later in the lab, the group was tasked with identifying MTR locations using reclassified 2010 satellite imagery.  This was done in the ERDAS Imagine program.  The classified locations were then brought into ArcGIS, and the data was converted from a raster to a polygon feature.  With the polygon feature created, noise and interference were removed.  This consisted of areas around roadways, highways, streams, and major rivers, all of which share similar spectral signatures with MTR locations.  Finally, with this removed, we isolated the polygon features larger than 40 acres, which left us with the polygons most likely to be MTR locations.
Our group consisted of four members, and each played a role in producing the data.  We had two satellite images to analyze, so we assigned two members to each.  I took on the responsibility of keeping communication open: I started many of the discussions where we needed to share information back and forth, and I returned initial results to the group quickly so I could share findings and help with any issues that arose.  The final MTR features were posted to ArcGIS Online to share the group's results.  I added another layer to the map that shows coal fields as identified by the USGS Eastern Energy Resources Science Center.

Mountain Top Removal Map
ArcGIS Online Login Required



Sunday, September 28, 2014

Project 2 - Analyze Week
Landsat & MTR for Skytruth
GIS 4930


This week's lab had me analyzing Landsat imagery to determine locations that are considered mountain top removal (MTR) areas.  The lab had me join a group (Group 3) and then use ArcGIS and ERDAS Imagine to classify these locations.  I combined Landsat bands to produce a single multi-band image, and from this image I ran an unsupervised classification in ERDAS Imagine.  I classified 50 different spectral values to identify which locations were MTR and non-MTR.  With this analysis done, I manually classified and recoded the MTR values in ArcGIS.

Above is a screenshot of the final result.  Of the 50 separate classes, I was able to identify 11 of them as MTR or similar.  This included some urban areas as well as roadways.  These anomalies should be corrected by using buffers in the next lab, but they could not be ignored due to the similarity of their values to true MTR locations.

Tuesday, September 23, 2014

Project 2 - LiDAR / Hydro Prepare
GIS 4930


This week I prepared for my next project.  Project 2 had us first sign up with a group.  When the lab began I was working with LiDAR, Digital Elevation Models (DEMs), and hydrology to better understand how LAS datasets are set up in ArcGIS.  I began the lab by learning how to find LiDAR data on the USGS website, and quickly moved into understanding how the LAS Dataset toolbar operates.  Some of the 3D modeling is really fascinating, and I enjoyed working with it.  Ultimately I was shown how to create a raster from a LiDAR image.  Even though I didn't have to do this in the lab, the instructions were provided so they can be referred to in the future.

The next lab was more focused on our group and processing the hydrology of our study area based on the DEMs provided to the group.  I learned how to create a stream feature from the DEM data using the hydrology tools within ArcGIS.  There are six basic steps to achieve this (a minimal arcpy sketch follows the list):

1: Run the Fill tool
2: Run the Flow Direction tool
3: Run the Flow Accumulation tool
4: Calculate 1% of the pixels (the stream threshold)
5: Run the Con tool
6: Run the Stream to Feature tool
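Here is that chain written out with arcpy's Spatial Analyst functions.  I am assuming the "1% of the pixels" threshold means 1% of the maximum flow-accumulation value, and the raster names are placeholders.

```python
import arcpy
from arcpy.sa import (Fill, FlowDirection, FlowAccumulation, Con,
                      StreamToFeature, Basin)

arcpy.CheckOutExtension("Spatial")

filled = Fill("dem")                    # step 1: remove sinks
flow_dir = FlowDirection(filled)        # step 2: flow direction
flow_acc = FlowAccumulation(flow_dir)   # step 3: flow accumulation

# Steps 4-5: keep only cells above the stream threshold (assumed to
# be 1% of the maximum accumulation value).
threshold = flow_acc.maximum * 0.01
streams = Con(flow_acc > threshold, 1)

# Step 6: vectorize the stream cells into a polyline feature class.
StreamToFeature(streams, flow_dir, "streams.shp", "SIMPLIFY")

# The watershed step mentioned below: basins from flow direction.
basins = Basin(flow_dir)
basins.save("basins.tif")
```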

Finally, I was able to create basins using the Basin tool.  The basins and streams can be seen in the map above.  Basins identify watersheds, meaning runoff within a basin polygon will make its way to the stream feature identified within that polygon.

Monday, September 15, 2014

Project 1 - Report Week
GIS 4930

This week was Report week, the week we took everything from Prepare week and Analyze week and combined it to produce results that could actually be used in the scenario.  As you have read in my previous posts, a hurricane is heading toward Tampa, Florida.  In the maps below I used network datasets to generate the results.  In Scenario 2 you will find a map of a shelter located at Tampa Bay Blvd.  This map is to be used by drivers delivering supplies to designated shelters throughout Tampa.  The routing took high water due to storm surge into consideration and redirected drivers around areas that could potentially be flooded.  The documents were printed in grayscale, so special attention had to be paid to colors and the limitations of different types of symbology.  The overview map highlights the entire area, and the extent boxes define which map the driver should use.

Scenario 2


In Scenario 4, I was asked to generate a map that could be displayed for a television station that showed local shelters. It also displayed, using different colors, which was the quickest shelter to get to based on local roads and delays that may be present at the time of evacuation.  I utilized ArcGIS to create the initial map, which included the roadways, north arrow, and scale bar.  All other work was done in Adobe Illustrator.  The flexibility of Adobe Illustrator allowed me to create special effects and graphics that ArcGIS does not have, or is limited on.  Drop shadows, layering of other images, and better control of text font were all used to produce a polished map for the viewing audience.

Scenario 4


Overall, this week's project was enjoyable, if time consuming.  Each map was unique and, even though the data was prepared the previous week, still took a lot of time to tweak for optimal viewing.  It was good to see Adobe Illustrator again, as the last time I used it was in my Cartography class.  Once I got started and did a little refresher reading, it was like riding a bike, and I was able to do the tasks needed to produce the desired results.

Monday, September 8, 2014

Project 1: Analyze Week
GIS 4930


In this week's lab we created transportation routes for different scenarios.  Using Network Analyst, I created a network dataset from the transportation layer, and from that I was able to set parameters to restrict or scale travel time for certain routes.  In the above map you can see that several routes were created: patient evacuation routes from Tampa General Hospital to both St. Joseph's Hospital and Memorial Hospital, and supply routes from the U.S. Army National Guard Station (Armory) to the three shelters shown on the map.  These routes took flooding into consideration.  Finally, I produced polygon areas showing which shelter covers which local roadways in town, allowing residents to see which shelter is closest to them based on driving time, not distance.
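For reference, this kind of route can also be built in code.  A minimal Network Analyst sketch follows; the network dataset name and the cost attribute are assumptions, since the lab work was done through the Network Analyst toolbar.

```python
import arcpy

arcpy.CheckOutExtension("Network")

# Build a route layer on the network dataset; "Tampa_ND" and the
# "TravelTime" cost attribute are assumed names.
route = arcpy.na.MakeRouteLayer("Tampa_ND", "Hospital_Evac",
                                "TravelTime").getOutput(0)

# Load origin and destination points as stops, then solve the route.
arcpy.na.AddLocations(route, "Stops", "evac_stops.shp")
arcpy.na.Solve(route)
```

The shelter coverage polygons come from the analogous service area workflow (MakeServiceAreaLayer) rather than a route solve.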

I modified my map this week from the original.  I didn't want to change too much, but wanted to add things that I felt were not very clear, or needed improvement.  As you can see, I added an inset map to show you where Tampa was located and its surrounding area.  I also modified the symbology and legend to display the important information.

We were first introduced to Routing early on in our GIS course training.  This lab took what we learned early on and improved upon it.

Tuesday, September 2, 2014

Project 1 - Prepare Week
GIS 4930


This week I prepared a basemap (above) to be used in my first project for Special Topics.  This week was all about preparing for the scenario to come: Tampa, Florida is preparing for a hurricane projected to make landfall the following week.  For the Prepare lab, I was tasked with creating a basemap showing a storm surge 6 feet above sea level.  Using a provided Digital Elevation Model (DEM), I reclassified the elevations to be in feet.  On the map I also provided the roadways, after some prep work that allows them to be built into a network dataset and therefore support routing later on.  This will aid in evacuation routing, road closures, and supply routing for disaster relief.  I also provided the locations of shelters, police departments, fire departments, hospitals, and the U.S. Army National Guard Station in the greater Tampa area.
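The surge surface boils down to two raster-algebra steps.  A minimal sketch, assuming the source DEM is in meters and the layer names are placeholders:

```python
import arcpy
from arcpy.sa import Con, Raster

arcpy.CheckOutExtension("Spatial")

# Convert the DEM to feet (assuming a metric source), then flag
# everything at or below 6 feet as potential storm-surge inundation.
dem_ft = Raster("tampa_dem") * 3.28084
surge = Con(dem_ft <= 6, 1)
surge.save("surge_6ft.tif")
```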

This map will serve as a basemap to do analysis and produce reports and maps that emergency responders, reporters, and residents will need in the days ahead, during, and after the storm.

Overall, this week's lab placed greater emphasis on what we have learned so far, and allowed me the freedom to apply those skills to produce a map that provides the information needed in next week's lab.

Monday, August 4, 2014

Final Project
GIS 4048


For my final project I took on the role of a realtor.  I was hired by the Smith family, who are relocating to Bernards Township, New Jersey.  Mr. Smith has a list of specific criteria he would like met in order to see where in town he can begin his home search.  I was asked to map each criterion, then use a weighted analysis to compare them all and discover an ideal area of town to search for a home.  Finally, I created a PowerPoint presentation to display the results I discovered.  My results were surprising!  Take a look at the presentation, review the criteria, and view the results; I think you too will be impressed.

By using several of the analysis tools I learned during the class, I was able to complete several maps to perform my weighted analysis from.  I am excited to take these new skills into my workplace and use them.  I found that the predicted results were slightly different from those the final analysis produced.  I also feel that I would add a zoning layer to the results to display those areas that are residential.  Overall, the project was really informative and fun to do.  I encourage you to take a look at my presentation and comment on my blog if you have any questions.

Monday, July 14, 2014

Lab 9 - Urban Planning: GIS for Local Government
GIS 4048

In this week's lab we were introduced to Data Driven Pages, also known as Map Books.  The map listed below is one of several pages that were created for a series of extents defined by the index grid.


The process of completing this map began with downloading data from the Marion County website.  I became familiar with the Marion County Property Appraiser's website, which allowed me to do a parcel search, locate parcel number 14580-000-00, and identify the owner and other characteristics of the property.  These details gave me insight into the type of zoning the property fell under, as well as details like how much a dog house cost to construct.  Using the MAP IT functionality, I was then able to export all parcels within 1 mile of my targeted parcel, which gave me a good number of parcels.  Next I created a site map and began joining the parcels to the CSV file that contained more details about each property, like owner name.  With my data joined, I exported it to a new feature class named Parcels_Join.  I then identified the Zuko parcel and selected all the properties within 1/4 mile of it, giving me a feature class of every property within that radius.  Using the Explode Multi-Part Feature tool, I broke these features apart so that every property appeared as an individual record in the attribute table, even if it was connected to another parcel.  Using the Identify tool, I then went into the quarter-mile parcel file and created a "map key" to be used in my report later.

Next came the Data Driven Pages.  I used the Grid Index Features tool, found in ArcToolbox under Cartography Tools > Data Driven Pages, to create a grid at a map scale of 2400.  I then used the Select by Location tool to select the zoning that intersected the properties within 1/4 mile, which reduced the zoning to just the isolated area I was working with.  I continued to add data to my main data frame, such as streets and parcels, to make the map more aesthetically pleasing, and added my map essentials.
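The same grid can be generated in code.  A hedged sketch of the Grid Index Features call follows; the parameter choices and names are assumptions, since in the lab I built the grid through the tool dialog.

```python
import arcpy

# Index grid for the map book over the quarter-mile parcels; the
# half-mile (2,640 ft) cell size is an illustrative assumption.
arcpy.GridIndexFeatures_cartography(
    out_feature_class="grid_index.shp",
    in_features="parcels_quarter_mile.shp",
    intersect_feature="INTERSECTFEATURE",
    polygon_width="2640 Feet",
    polygon_height="2640 Feet")
```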

I worked with the data grid directly to add a scale field to my attribute table, making all values equal to 2400.  I played around with the scale and found the best fit for my main data frame (105% worked well for me).  I added some dynamic text to the layout so that each page could be updated uniformly.

With the majority of the work completed on my main data frame, I focused in on my Locator map located in the bottom left.  Utilizing Layer Properties and the Definition Query > Page Definition I was able to set up a mask that would allow the page I was currently working with to be highlighted on the Locator Map.  This functionality is really great to work with and provides the reader an exact location that the main data frame is focused on.

With all my map essentials added and the pages ready to be published, I simply exported my map as a PDF.  All of the pages were joined into a single PDF document.

With the map completed, my focus turned to the map key.  Using the report builder in the attribute table (Table Options > Reports > Create Report), I was able to generate a report showing a map key identification number, parcel ID, owner name, address, zip code, and acreage.

I ran into a little difficulty with the requirements for the report, in particular the zoning code.  I could not locate zoning code values in the attribute table.  I initially believed this may have been the zip code, but I can see how a spatial join could merge the zoning onto the parcel layer.  That could also prove difficult, since some parcels in the area fall in multiple zones (see properties 13, 14, and 19 on the map above).  After consulting my instructor, I manually entered the data into a new field, which allowed me to produce the report and show the zoning code.

In the second scenario of this week's lab, we put ourselves in the shoes of a GIS analyst looking for an optimal location to build an extension office that must meet certain criteria.  First we had to update our records for a set of parcels: we merged the lots/parcels together, then used the Cut Polygons tool and the Feature Construction toolbar to create new parcels, updating the details for both.

With the updates completed, we calculated acreage on parcels owned by Gulf County and then searched those properties for values above 20 acres.  With this data in hand, we located vacant land to isolate the areas that could be recommended for building the new extension office.  These locations were presented using the Create Report option found in the attribute table options.

This week's lab had a number of tasks and tools that needed to be used together to produce the desired results.  I believe the maps I produced for Mr. Zuko will provide him the information he wanted, and the BOCC should finally be able to build that extension office in a desirable location.

Participation Assignment
GIS 4048

Note: Click Images to enlarge

After conducting a web search to locate a property appraiser's office in my area, I discovered the Franklin Township Tax Assessor's Office website: http://www.franklintwpnj.org/.



Q1:      Does your property appraiser offer a web mapping site?  If so, what is the web address?  If not, what is the method in which you may obtain the data?

            The appraiser's office does not link directly to a web mapping site from its page, but I was able to find one available from the home page, and it provides key information about each property.  The web address for this site is www.sdlportal.com, but it can also be reached by clicking the Citizen Request Portal button on Franklin Township's home page.  Users must create an account and will then have permission to log in and search properties.  By conducting a search, the user can pull up any address in the town and view information about the property, including owner information, location (address, block, lot, and qualifier), a description of the property, and the valuation or assessed value.  The user can also see permits associated with the property and their current status.

Next I looked into finding a list of recent property sales by month.  On the tax assessor's web page I discovered an "Online Tax Assessment Data" link, which took me to another page where I could enter information to obtain what is referred to as Mod IV (Mod 4) data.


I was directed to a website that allowed me to search Assessment Records for the entire state of New Jersey.  This can be found at: http://tax1.co.monmouth.nj.us/cgi-bin/prc6.cgi?menu=index&ms_user=glou&passwd=data&district=1808&mode=11


After conducting my search, I found that the records are only good up to May 8, 2014. 

Q2:    What was the selling price of this property?  What was the previous selling price of this property (if applicable)?  Take a screen shot of the description provided to include with this answer.

            Since I was unable to look up the month of June, I used the last full month covered by the spreadsheet I downloaded in CSV format from the assessor's office; in this case, April had a full month of sales.  I identified the highest-selling property sold in the month of April: 1 Heller Park Lane, with a sales date of 4/7/2014, sold for $13,500,000.  The previous selling price was not listed in the data provided.


Q3:      What is the assessed land value?  Based on land record data, is the assessed land value higher or lower than the last sale price?  Include a screen shot.

The assessed land value for the property I identified previously (1 Heller Park Lane) is set at $4,679,000.  The land value is lower than the last sale price.  I also noticed that the combined land and building assessment still fell below the sales price.



Q4:      Share additional information about this piece of land that you find interesting.  Many times, a link to the deed will be available providing more insight to the sale.
From my research I was able to find out a lot more about the property.  I determined the property size (26.79 acres), owner information (Matrix Somerset I LLC), the zoning it fell within (M1), and the current block/lot (514/8.03).  Also listed in the data was the assessed value over a number of years.  Using the property location, I went to the Franklin Township SDL Portal and did a search based on the block and lot.  I was shown a lot of detailed data, including the current permits that are open for the property.  Some of these improvements could affect the assessed value in the future.




For the second part of our participation assignment, I took on the role of working for the Property Appraiser's Office.  They asked me to review the land value assessments in the West Ridge Place subdivision and tasked me with creating a map to help show inconsistencies in the current assessments.  Below is the map that was created.


Q5:      Which accounts do you think need review based on land value and what you've learned about assessment?  Please answer this question within your blog post.

Excluding the odd-shaped properties with low values, I am left to look at the properties as a whole.  I believe the majority of the properties fall within the $27,075 range for land assessment.  That said, I feel all the properties should be assessed yearly, as this was the trend in the table.  I would want to look for a trend in property value, so I would examine assessments from previous years; by looking at all the years, I can see which properties are trending higher or lower in value.  Perhaps some properties had improvements done?  Perhaps others are priced differently due to the condition of the homes?  Do the easement restrictions reduce the property value or improve it?  These all weigh on the land value.  I would expect requests for reappraisal from the properties above the average.  Those below the average may not seek reassessment, to avoid higher property taxes until it's time to sell the home.  In that case I would want to make sure the lower-assessed properties are looked at and a determination made as to why they do not meet the average.  Therefore, my focus would be on the yellow and light orange properties, with the expectation that the red property may ask to be reassessed as well.






Monday, July 7, 2014

Lab 8 - Urban Planning: Location Decisions
GIS 4048

Home buying, it's one of those things that the majority of us will do in our lifetime, possibly more than once. In this week's lab we looked into determining where to buy when you have specific criteria that must be met. This week we put on our Realtor hats and helped a couple find a home in Alachua County, Florida.

Initially we were told that the couple would like to stay close to their worksites, and would also like to be among homeowners between the ages of 40 and 49.  My first map took all the variables and outlined them individually so the couple could see each one.

After setting up my environments and bringing in the relevant data, one of my first tasks was to create a basemap that could be used in all the images.  The basemap provided the details the couple would need, but also helped with the map design process.  Next came the proximity analysis, using the Euclidean Distance tool to compute distances outward from the couple's places of employment as a raster dataset.  I then reclassified the distances into 5,000-meter increments to give the ring distribution a uniform look.  Next came the analysis of the age criterion: a new field was created in the attribute table, and the percentage of individuals aged 40-49 was calculated from the 2010 Census data.  With this new data in hand, a graduated color value was assigned to the census tracts based on the new percentage field.  I used the same color ramp for all maps, allowing me to symbolize all results uniformly.  For the homeownership analysis, I followed the same steps as the age analysis.  With all my analyses completed, I produced the following map.
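A minimal sketch of that distance-then-reclassify step in arcpy; the layer names and the 25 km extent are assumptions.

```python
import arcpy
from arcpy.sa import EucDistance, Reclassify, RemapRange

arcpy.CheckOutExtension("Spatial")

# Straight-line distance outward from one worksite (name assumed).
dist = EucDistance("worksite_north.shp")

# Uniform 5,000 m rings, scored 5 (closest) through 1; the 25 km cap
# is an assumption about the study extent.
rings = Reclassify(dist, "VALUE", RemapRange([
    [0, 5000, 5], [5000, 10000, 4], [10000, 15000, 3],
    [15000, 20000, 2], [20000, 25000, 1]]))
rings.save("worksite_north_rings.tif")
```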

I found it very beneficial to use the same color ramp/schema for all the layers.  This allowed for one easy-to-read legend that gave the couple a simple way to read the map: dark green = a good area for that criterion, dark red = a poor one.  All the shades in between give them an easy way to decide whether to sacrifice one criterion to gain more of another.



The next map was created with the original intent of showing the couple optimal areas to search for a new house.  A modified data frame was added after we determined that the couple wanted a more in-depth analysis of travel distance; they wanted more emphasis put on travel time and distance from their job sites.  With this in mind, I began my analysis of all the criteria and what areas would be ideal both at equal weights and at slightly modified weights.

Since this analysis was going to be performed often, I opted to use ModelBuilder.  With ModelBuilder I could run the same tool over and over again with simple modifications to the parameters.  In this case we used the Weighted Overlay tool, and I modified the influence percentages and assigned restrictions on scaled values to narrow my search even further.

In the first data frame, Equal Weight, I used no restrictions and set equal influence (25%) on all four criteria.  In the second data frame, Modified Weight, I really narrowed down my search.  I did this by first giving the distance layers 35% influence each and only 15% influence each to the homeownership and age criteria.  I went a little further here, too: I limited the search to within 15,000 meters by restricting all distance values except the three closest rings, which eliminated everything beyond 15,000 meters of both worksite locations.  I then said to myself, if I am going to search the areas closest to the job sites, I also want to see only the top percentages of the homeownership and age criteria, so I allowed the top five percentage classes to be valued and restricted all the others.  This left me with results very close to the couple's worksites that were still focused enough on the other criteria to give them the best areas of selection, even limited on distance.

The results I came up with in both searches led me to the conclusion that the three areas I highlighted on the map would be ideal for my customers.  I added references to the maps so they could easily find their way around, and kept the technical details as brief as possible.

As someone who recently went through buying a home, I know that I just want my Realtor to point me to the right place; I can then focus my search and make a determination when I see the location and the area around it.


Tuesday, July 1, 2014

Lab 7 - MEDS Protect
GIS 4048

In this week's lab I took what I prepared in Lab 6 and put it to use in protecting the Boston Marathon.  The first objective for my Critical Infrastructure and Buffer Zones map, shown below, was to isolate a 3-mile security zone around the Boston Marathon finish line.  With this event buffer identified, I needed to isolate the 10 hospitals or medical centers closest to the finish line, which I accomplished using the Near tool in ArcGIS.  I then created five-hundred-foot security perimeters around these locations and the finish line using the Buffer tool, to mitigate any threats should an event occur.  Security checkpoints needed to be set up at the perimeter of the finish line buffer; I identified locations for them with the Intersect tool, by finding where local roads crossed the 500-foot buffer zone.  These locations were added to my inset map as a point feature class.  With all the data assembled, I used the Military Portrait layout to populate the key information needed by readers and verified my map essentials.
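Scripted, those three steps look roughly like this; the feature class names are placeholders, not the lab's actual data.

```python
import arcpy

# Near writes NEAR_FID and NEAR_DIST onto the hospitals, which is how
# the ten closest facilities can be ranked.
arcpy.Near_analysis("hospitals.shp", "finish_line.shp")

# 500-foot security perimeter around the finish line.
arcpy.Buffer_analysis("finish_line.shp", "finish_buffer.shp", "500 Feet")

# Checkpoints: where local roads cross the perimeter.  Converting the
# buffer to a line first makes the crossings come out as points.
arcpy.PolygonToLine_management("finish_buffer.shp", "buffer_line.shp")
arcpy.Intersect_analysis(["roads.shp", "buffer_line.shp"],
                         "checkpoints.shp", output_type="POINT")
```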



The second map went into surveillance of the finish line and the surrounding areas.  Here I used a multitude of tools to identify optimal locations and elevations for cameras around the Boston Marathon finish line.  Identifying these locations, and understanding what could be seen from each vantage point, lets security know where best to place assets.  In this map I worked with LiDAR (Light Detection And Ranging) while exploring the LAS toolbar, which gave me the capability to convert an LAS dataset to a raster.  Exploring and preparing my elevation data was critical for the next steps in the process.  I generated a hillshade surface from the raster I created with the LAS toolbar, which allowed me to identify areas of shade for a specific day and time using the altitude and azimuth settings.  Adding in the orthoimagery, the shading took on a new life as the buildings cast their shadows; it was like stepping back in time to 2:30 pm that afternoon and seeing where the shadows fell across the landscape.

Next came the selection of security points that I felt would provide optimal observation areas.  I selected 15 points and generated a viewshed of the area based on those points.  My initial impression was that all of my viewpoints were in good spots, since I took the green shading to mean I could clearly see the finish line.  However, this was soon disproved: when I created line-of-sight lines from my cameras to the finish line, I quickly discovered that several points were blocked by obstructions along the path.  I had to go back and update the elevations on most of my camera locations, adding several meters of height and recalculating the viewshed until each camera had as little obstruction of the finish line as possible.
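A sketch of the hillshade and viewshed calls in arcpy; the sun position values are placeholders for the lab's race-day settings.

```python
import arcpy
from arcpy.sa import Hillshade, Viewshed

arcpy.CheckOutExtension("Spatial")

# Shadows for a specific date and time; the azimuth/altitude values
# here are placeholders for the 2:30 pm sun position.
shade = Hillshade("elevation_ras", 225, 50, "SHADOWS")
shade.save("hillshade.tif")

# Visible cells from the camera points.  Viewshed honors an OFFSETA
# field on the observers, so raising a camera a few meters is just an
# attribute edit followed by a re-run.
vshed = Viewshed("elevation_ras", "cameras.shp")
vshed.save("viewshed.tif")
```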

I used the Create Profile Graph tool on the 3D Analyst toolbar to generate a graph for Camera 4 showing its location and the distances at which the view could be obscured.

Next I used ArcScene to create a 3D model of the area in and around the finish line.  This was by far the best part of the lab: manipulating the model, I could fly through the landscape.  Draping the ortho layer on top of the elevation layer gave me a sense of dimension while working with the layers.  I copied over the line-of-sight lines and exported the scene so it could be added to my layout.

I finished the map with a custom layout, identifying all map essentials and adding the information that was needed.


Although very time consuming, I felt this lab was great!  I was able to reinforce skills I already had and use new tools to produce the results desired on my deliverable map.  I believe next week we move on to local government...can't wait!

Monday, June 23, 2014

Lab 06 - MEDS Prepare
GIS 4048


The Minimum Essential Data Set (MEDS) was born from the Homeland Security strategy objectives compiled by the White House on July 16, 2002.  These objectives were passed down to the Department of Homeland Security as Presidential Directive 8, which is designed "to achieve and sustain risk-based target levels of capability to prevent, protect against, respond to, and recover from major events, and to minimize their impact on lives, property, and the economy through systematic and prioritized efforts by Federal, State, local, and Tribal entities, their private and nongovernmental partners, and the general public."  The MEDS is compiled from multiple internet sources; the minimal data for a MEDS may include shapefiles, raster datasets, tables, and geodatabases that will assist in times of emergency.

The data within a MEDS comes from a variety of useful datasets grouped together by type.  The following is a list of some of the recommended datasets included in a MEDS.

Orthoimagery provides aerial imagery of the area and requires a high resolution; a resolution of 1 foot is typical for this raster dataset.  This imagery gives responders a bird's-eye view of the area.

Elevation is typically provided by a Digital Elevation Model (DEM).  These DEMs can be obtained from the USGS, or, if a larger area is needed, NMSS can provide DEM datasets from the National Elevation Dataset (NED).

Hydrography can be obtained nationally from the USGS and U.S. Environmental Protection Agency (EPA) from their National Hydrography Database, which is also a component of the National Map.

Transportation data can be obtained from the National Map, and can also be updated from Federal, State, and local entities.  The transportation layer helps responders with egress planning and identifies what roads are considered Primary, Secondary or Local in nature.

Boundaries are typically identified and obtained from the National Map.  As the study area becomes more defined, a more accurate projection such as State Plane can be used to make the data more precise.

Structures, although not obtained in our Prepare section, could be obtained from local and state governments, or digitized against the aerial imagery to create a new layer if needed.

Land cover can be obtained from the 2006 National Land Cover Database.  This detailed database provides users 30-meter cell resolution and 16 different classes of land cover.  This information is important to homeland security planning and operations, giving readers a "lay of the land" and helping mitigate the impacts of a catastrophic event.

Geographic Names or Geographic Names Information System (GNIS) was developed by the USGS in cooperation with the U.S. Board on Geographic Names, and contains information about physical and cultural geographic features in the U.S. and associated areas.  These records contain both current and historical data.


In this week's lab we prepared the MEDS geodatabase for use in the future.  Having information up to date and readily available is key when dealing with MEDS as you never know when a disaster may strike and the data will be needed.  Although some of the data was provided to us from UWF, it's important to know where to go to keep this information up to date, and to obtain it for myself in the future.

For this lab we used a variety of techniques we have learned in the past to massage the data and make it display the information that is key for the study area, in this case the greater Boston, Massachusetts area.  Our first objective was to set up the environment we would use; to do this we created multiple group layers to help organize the data.

In the Transportation group, I categorized the feature class based on the CFCC ranges that identify primary, secondary, and local roads in the area, grouping the roadways by their CFCC codes.  With the new road layers created independently, I adjusted symbology and scale ranges so that the data was easier to see on the map and stayed clear at different scales.

For Hydrography, we worked with adding data to group layers but made no major modifications to the data as presented.  Land cover was modified slightly: I used the Extract by Mask tool from Spatial Analyst to extract the classifications found within the Boston study area, then imported a color map to identify those classifications and give them a uniform symbology.  After applying the new labels, I saved my copy of the layer.  The Orthoimagery and Elevation layers, like Hydrography, were added to their corresponding group layers with no major changes needed.

The final group, Geographic Names, let me work with manipulating formats and schemas to prepare data for import.  Here we had to modify the schema.ini file to declare the table pipe-delimited, which made it possible to import the XY data from the table and create a point feature class of locations and points of interest.  This data covered a wide area that had to be narrowed to our study area: I first used an attribute query to select the features in the counties of interest, then kept only those completely within the study area boundary.  The remaining features were saved and added to the Geographic Names group layer.  With this completed, I modified the labels and symbology to be aesthetically pleasing at different scale ranges.

My final task was to make it easier to apply these same settings in the future.  To do this, I saved each group layer as a layer (.lyr) file so I can bring the data back later without rebuilding the symbology, label scales, or group structure.
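Two of those steps, clipping the land cover and importing the GNIS table, can be sketched in arcpy as follows; the field names are the GNIS defaults and may not match the course data exactly.

```python
import arcpy
from arcpy.sa import ExtractByMask

arcpy.CheckOutExtension("Spatial")

# Clip the national land cover grid to the Boston study area.
landcover = ExtractByMask("nlcd_2006", "boston_study_area.shp")
landcover.save("boston_landcover.tif")

# Point layer from the pipe-delimited GNIS table (read through the
# edited schema.ini); PRIM_LONG_DEC/PRIM_LAT_DEC are the GNIS
# decimal-degree fields, and NAD83 (EPSG 4269) is assumed.
arcpy.MakeXYEventLayer_management("GNIS_MA.txt", "PRIM_LONG_DEC",
                                  "PRIM_LAT_DEC", "gnis_points",
                                  arcpy.SpatialReference(4269))
arcpy.CopyFeatures_management("gnis_points", "geographic_names.shp")
```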

Although this lab seemed short, I believe what we learned this week is vital to being prepared to handle emergencies in the future.  It gave me a good understanding of key essential layers, where to find them, and how to prepare the data for use in a specific area.  As we never know when or where a disaster might strike, it's good to know that the data is available, and with some help, ready for immediate use.

Thursday, June 19, 2014

Lab 5 - DC Crime Mapping
GIS 4048

For this week's lab we took a bite out of crime in the Washington, D.C. area.  My first task was establishing a quality workspace by modifying the environment settings.  Once this was accomplished, I saved the map twice in order to preserve my settings for the second map.

In the above map, I used a Microsoft Excel spreadsheet and the Display XY Data tool to identify and display the crimes committed in Washington, D.C. in January 2011.  The analysis was conducted to locate where a proposed new police substation should go.  Using the street addresses from a spreadsheet listing all Washington, D.C. police stations, I geocoded their precise locations using an address locator built on the roadway layer provided by Tele Atlas, then marked those locations with unique symbology on the map.  Using the Multiple Ring Buffer tool, I determined how many crimes were committed within 0.5, 1, and 2 miles of a police station; those findings are on the map.  I also used a spatial join to tie each crime to its closest police station, which let me calculate the percentage of crimes each station covers compared to the other departments.  With the analysis completed, I identified a location on the map that I felt would be the best spot for an additional police substation.  The location is ideal for fighting the two most frequent crimes in the city, theft and theft from auto.
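A rough arcpy sketch of the geocoding and ring-buffer steps; the locator name and the field-map string (pairs of locator field and table column) are assumptions.

```python
import arcpy

# Geocode the station addresses against a street locator.
arcpy.GeocodeAddresses_geocoding(
    "police_stations.csv", "DC_Streets_Locator",
    "Street ADDRESS;City CITY;State STATE;ZIP ZIP",
    "stations_geocoded.shp")

# Dissolved rings at 0.5, 1, and 2 miles around all stations, used to
# tally crimes per distance band.
arcpy.MultipleRingBuffer_analysis("stations_geocoded.shp",
                                  "station_rings.shp", [0.5, 1, 2],
                                  "Miles", "", "ALL")
```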


The second map I created focused on specific crimes.  The Kernel Density tool was used to display these crimes in a "hot spot" style of mapping: areas where a specific crime occurred with greater frequency show up in a darker shade.  These areas were compared to population density using the Swipe tool, and the results were somewhat surprising.  Homicides occurred in low-to-medium population density areas.  Sex abuse was more predominant in medium population density areas.  Burglaries, although widespread throughout the city, were more concentrated in heavily populated areas.

Overall the project was a great learning experience.  I had some trouble selecting the proper size for my maps so that there was not too much unused space, but I believe in the end I have a well-balanced map that clearly provides readers with the information I want to present.




Tuesday, June 10, 2014

Lab 4 - Hurricanes
GIS 4048
In this week's lab we continued with natural hazards, with the emphasis on hurricanes and their destructive power.  To begin the lab I was asked to generate the path of Superstorm Sandy, which struck the East Coast on October 29, 2012 just north of Atlantic City, New Jersey.  The map points were generated using XY data from a pre-existing spreadsheet.  Once the 31 points were generated, I plotted the path of the storm using the Point to Line tool in ArcToolbox.  I also created a unique symbol for the storm, and color coding that symbology let me show the intensity of the storm along its track.  Each symbol was then labeled with both the wind speed and the barometric pressure.  The last things to add were graticules for longitude and latitude, along with the basic map essentials.  With this done, I produced the map shown below.
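The track line itself is one tool call.  A minimal sketch, assuming the plotted points carry a hypothetical timestamp field to sort on:

```python
import arcpy

# Connect the 31 plotted storm positions into a single track line;
# OBS_TIME is an assumed timestamp column that keeps the vertices in
# chronological order.
arcpy.PointsToLine_management("sandy_points.shp", "sandy_track.shp",
                              Sort_Field="OBS_TIME")
```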

The second deliverable in this lab is a Before and After map of a location devastated by Hurricane Sandy's landfall, Toms River, New Jersey.  This part of the lab struck home to me as I am a New Jersey resident.  The towns and shoreline impacted by the storm are locations I frequent often. 

To begin this part of the lab, I consolidated the data I needed into a geodatabase.  I created mosaics of the before and after images of the area affected by Sandy, then added a new feature dataset to the geodatabase containing New Jersey counties, municipalities, state, and road feature classes.  Adding the study area, I identified the area I would examine for structural damage.  Creating a new point feature class, I added a domain so that anyone adding points to the feature class is limited to a fixed set of attribute codes.  Using effect tools such as Swipe and Flicker, I compared the before and after aerial images to determine how structures were affected.  I used the parcel layer to decide where each marker should be placed, and rated the structural damage and wind damage sustained; I also recorded the property type and whether the house was inundated.  All these attributes were added to the Structure Damage feature class.  With the data gathered, I symbolized the damage and prepared a table showing the results.
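A sketch of how such a domain is built with arcpy; the damage codes and names here are illustrative, not the lab's exact list.

```python
import arcpy

gdb = r"C:\gis\sandy\damage.gdb"  # hypothetical geodatabase

# A coded-value domain restricts data entry to a fixed pick list.
arcpy.CreateDomain_management(gdb, "StructureDamage",
                              "Damage to structures", "SHORT", "CODED")
for code, desc in [(0, "No Damage"), (1, "Affected"), (2, "Minor"),
                   (3, "Major"), (4, "Destroyed")]:
    arcpy.AddCodedValueToDomain_management(gdb, "StructureDamage",
                                           code, desc)

# Bind the domain to the feature class field so the pick list appears
# while editing (assumed feature class and field names).
arcpy.AssignDomainToField_management(gdb + r"\damage_points",
                                     "STRUC_DMG", "StructureDamage")
```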

I identified 9th Street as my primary focus, and began to develop my map with the data provided.  I added the basic map elements and produced the following map.


As I stated above, this lab and the storm affected me personally.  A week prior to the storm I was at a MACUrisa conference with several GIS professionals and ESRI representatives in Atlantic City.  At the time, the storm was still in the Caribbean, and I sat with one of the top federal USGS GIS staff in the Northeast.  I asked him directly, "What do you think about this storm in the Caribbean?  You think it will hit here?"  He smiled and said, "No, those storms always come up here and make a right turn.  There's no way it will turn left."  A week later, it turned left.  Hopefully what we learned here in New Jersey will mitigate the hazards we face in the future.





Saturday, May 31, 2014

Lab 3 - Tsunamis
GIS 4048
 
 
For this week's lab we continued with natural hazards and creating maps to help mitigate their impact.  The lab was two-fold this week: first, we outlined areas affected by radiation from the Fukushima power plant disaster; second, we outlined evacuation zones for the areas affected by the tsunami.
 
This lab had me using several different tools to return the desired results.  In particular, I used the Multi-Ring Buffer tool to outline areas up to 50 miles away from the Fukushima power plant radiation leak.  Using Select by Location, I identified the populations that would be affected by the leak, so a risk assessment could then highlight those directly impacted by the threat.
 
In the tsunami portion of the lab, we revisited the Con tool and worked with raster DEMs to analyze the areas that would be directly impacted by a tsunami event.  Those areas were then broken down into evacuation zones based on the height of the water.  We were also able to use the evacuation zones to highlight the cities, roads, and nuclear power facilities that would be directly impacted.
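A minimal sketch of banding a DEM into evacuation zones with nested Con calls; the elevation breakpoints are illustrative, not the lab's exact values.

```python
import arcpy
from arcpy.sa import Con, Raster

arcpy.CheckOutExtension("Spatial")

dem = Raster("japan_dem")

# Nested Con calls band the low-lying ground into three evacuation
# zones; cells above the last breakpoint fall out as NoData.
zones = Con(dem <= 10, 1, Con(dem <= 20, 2, Con(dem <= 30, 3)))
zones.save("evac_zones.tif")
```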
 
Overall, I really enjoyed learning these new methods of analyzing different types of disasters and creating maps that will serve to mitigate the risks involved.