
Monday, July 27, 2009

Photo Enforcement Safety Benefits: Separating Fact from Fiction

Speed Camera Photo Enforcement

Photo Enforcement Safety Benefits: Distinguishing Fact from Fiction (Part II)

An evaluation of the Executive Summary of the report entitled "Evaluation of the City of Scottsdale's Loop 101 Photo Enforcement Program: Draft Summary."

Updated: 28 JUL 09, 1840EST

In my previous blog post, I laid the groundwork and context for this series of related posts.

In this part, I look at the Executive Summary of the preliminary analysis results of the fixed speed-enforcement camera demonstration program (SEP) that was "conducted" on Arizona State Route 101 (a.k.a. Loop 101) during the period of January 2006 through October 2006 (a duration of approximately 10 months).

The stated purpose of this "evaluation" program was to quantify the following five elements:
  • The impact of the SEP (photo enforcement program) on speeding detections (76 mph or faster)
  • The impact of the SEP (photo enforcement program) on average speeds
  • The effect of the SEP (photo enforcement program) on traffic safety (i.e., motor vehicle crashes)
  • The expected economic costs and benefits of the SEP (photo enforcement program)
  • The financial and public-perception impacts of the program
The "evaluation" was administered (but not conducted) by the Arizona Department of Transportation (ADOT) and utilized data from a variety of sources, namely the Arizona Department of Public Safety (for crash report data), ADOT itself (for data, traffic volume, and speed data), the Arizona Crash Outcome Data Evaluation Systems (for crash data and crash costs), the National Highway [Traffic] Safety Administration (NHTSA) and Redflex (for detection rates and traffic speeds).

In essence, the vendor standing to benefit most from a "favorable" outcome of this "evaluation" program, was Redflex itself. Does this sound a bit like asking the proverbial fox to guard the hen house?


OK. Here is where the fun begins.

The evaluation program was to look at four key time periods:
  • The before-SEP (photo enforcement program) period
  • The SEP (photo enforcement program) warning period, when only warnings were issued to drivers
  • The SEP (photo enforcement program) violation period, when actual citations were issued to drivers
  • The after-SEP (photo enforcement program) period, when the system was no longer utilized during the evaluation study
The Scottsdale SEP program employed six speed cameras on a 6.5-mile section of Loop 101 within the Scottsdale city limits.

Now keep in mind the primary objective of the SEP program was to assess potential safety and cost benefits (of reduced speeding detections) in terms of an assumed consequence of overall reduction of crash and injury rates.

The very periods likely to have the highest rates of crashes (particularly multi-vehicle accidents or MVAs) or associated injuries are periods of high vehicle density, like those during morning and evening rush-hours, for obvious reasons.

Would you be surprised if I told you that those periods were specifically left out of the speed detection portion of the program?

The reason? Simple.

The report itself stated that during peak-hour traffic, speeds were constrained by congestion, and therefore it was highly unlikely that high speeds (defined as being in excess of 11 mph over the posted speed limit of 65 mph) were even possible!

For this "reason" speed data were discarded or not measured. In other words, the times when crashes were potentially at their highest rates, speeding could have not be a direct causation factor, but other dynamics were more likely to be, such as excessive traffic density, driver inattentiveness, poor vehicle maintenance, and/or inadequate highway engineering designs.

I find this a remarkable admission.

I suspect that had that data been included, the overall "beneficial effects" of "reduced crash rates" as they related to speed (i.e., the positive effects of the SEP program) would have been further marginalized.

For convenience and for the sake of Redflex being able to call their program a "success," speed data from the most dangerous periods were completely ignored! Furthermore, the report stated that empirical data sets were either extremely limited or completely non-existent.

To make up for these deficits of real, concrete, independently generated empirical data, the "authors" instead relied upon limited data (from another section of the highway, not even within the Scottsdale city limits!) and upon complex statistical models, the kind that only individuals with Ph.D.s would use.

By using data from an entirely different portion of the highway and with a reliance on "hocus-pocus" math, the underlying integrity of the report's conclusions was severely undermined.

As if this wasn't enough...

Even with the limited data and the massaged statistical analyses, their findings suggested a 33% increase in rear-end multi-vehicle (MVA) crash rates (not at all surprising considering the unsafe dynamics these photo enforcement systems create, such as "traffic porpoising"), though the report also suggested the somewhat paradoxical finding that the related injuries were reduced (by 12.57%).
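For readers who want to check the arithmetic behind such headline percentages, these figures are simple relative changes. Here is a minimal Python sketch, with the caveat that the before/after counts below are hypothetical values chosen only to reproduce the report's quoted percentages; they are not data from the report itself:

```python
def percent_change(before, after):
    """Relative change from a 'before' value to an 'after' value, in percent."""
    if before == 0:
        raise ValueError("before-period value must be non-zero")
    return (after - before) / before * 100.0

# Hypothetical counts, chosen only to reproduce the report's headline figures:
rear_end_before, rear_end_after = 100, 133      # rear-end crashes per comparison period
injuries_before, injuries_after = 10000, 8743   # associated injuries

print(f"Rear-end crash change: {percent_change(rear_end_before, rear_end_after):+.2f}%")  # +33.00%
print(f"Injury change:         {percent_change(injuries_before, injuries_after):+.2f}%")  # -12.57%
```

Note that a rigorous before/after comparison would also normalize counts by exposure (e.g., crashes per million vehicle-miles traveled) before computing such changes, which is exactly the kind of adjustment the report leaned on its statistical models to perform.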

In reality, one would expect injuries associated with such crash types (along with offset or head-on collisions) to be of the most serious nature (i.e., spinal/neck injuries).

The executive summary also suggested that increases in these rear-end crash types were swapped for decreases in other types (i.e., potentially less severe ones). In plain English, this meant that overall crash reductions did not necessarily occur; rather, different (i.e., more severe) crash types replaced them.

I find this an even more remarkable admission.

However, the report did indicate that this comparison of the program evaluation periods to the "before program periods" did not consider other factors that could have or would have accounted for the varying accident rates observed or calculated, such as weather or other roadway conditions (like construction zones or lane restrictions).

Even with its limited amount of concrete empirical data, the report basically concluded that the SEP program reduced average speeds by approximately 9.5 mph during its implementation.

Now that may sound like a lot until one considers that highway posted limits should be set at the 85th-percentile speed (according to ITE engineering standards).

Therefore, if Loop 101 had a more appropriate posted limit of 70 mph or 75 mph (again, speeds only attainable during non-peak times), then these "speed detection reductions" would have been entirely irrelevant.

Stated another way, it appears to me that the SEP program only confirmed that the appropriate PSL (posted speed limit) should be raised in accordance with normal traffic flow rates (and that it remains 5mph to 10mph too low).
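For readers unfamiliar with the ITE convention mentioned above: the 85th-percentile speed is the speed at or below which 85% of free-flowing vehicles travel, and it can be computed from a spot-speed sample with a simple nearest-rank percentile. A minimal sketch, assuming a hypothetical off-peak speed sample (these are illustrative numbers, not Loop 101 measurements):

```python
import math

def percentile_speed(speeds, p=85):
    """Return the p-th percentile speed: the value at or below which
    p percent of the observed free-flow speeds fall (nearest-rank method)."""
    if not speeds:
        raise ValueError("need at least one speed observation")
    ranked = sorted(speeds)
    rank = max(1, math.ceil(p * len(ranked) / 100))  # 1-based nearest rank
    return ranked[rank - 1]

# Hypothetical off-peak spot-speed sample (mph):
sample = [62, 64, 65, 66, 67, 68, 68, 69, 70, 70,
          71, 71, 72, 72, 73, 74, 74, 75, 76, 78]
print(percentile_speed(sample))  # 74 -- under this made-up sample
```

Under a sample like this one, the 85th-percentile convention would point toward a posted limit in the low-to-mid 70s rather than 65 mph, which is the author's point about the PSL being set too low.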

The report's executive summary of the SEP report does "come clean" in certain respects:
  • The "results" were based on small and incomplete [data] samples. (ie; insufficient data)
  • The "results" were based upon incomplete time (ie; insufficient time where random fluctuations of crash were common and could have influenced the results substantially)
  • Crash result trends were made at another site and crash data was used from high-peak periods (ie; rush-hour) even though speed detections were ignored for same periods.
Another interesting golden nugget suggested in the executive summary of this preliminary report of the Loop 101 SEP evaluation program was the admission that the Scottsdale section of Loop 101 was already statistically safer to drive than other highways throughout the country!

Perhaps, this was the report's most remarkable admission for several reasons.

This acknowledgment suggests the obvious: that speed, in and of itself, was not a major contributor to highway crash or injury rates and has, at most, a limited adverse impact on overall highway safety.

Why would a city that was being actively lobbied at the time by Redflex and/or ATS conduct such a test [for/with] these very same companies who were (at the same time) forecasting huge potential profits from 'exploitation' of the marketplace?

Why would the findings (i.e., conclusions) from this preliminary report be used to drive legislative policy nearly two years before the final report was to see the light of day?

Answer (to both questions): Follow the money.

How the SEP could be described as an "unqualified" success (by others not financially connected to its findings) would be a bit of a stretch.

How the report concluded (as it did later) that a 33% increase in rear-end crashes as a result of the SEP was a "negligible" increase and an "equitable" exchange for less-severe accidents would be more than a bit of a stretch.

That's enough analysis for now, because if you are like me, your head may be dizzy from all the spin.

©2009, all rights reserved. no portion of this article may be reproduced without expressed consent of the author.

Sunday, July 26, 2009

Intelligent Transportation Systems' Safety Benefits: Separating Truth from Falsehood


Privacy- and due-process-robbing technology saturating the streets of the greater Phoenix, Arizona metro area, administered by a for-profit industry.

Intelligent Transportation Systems' Safety Benefits: Separating Truth from Falsehood (Part I)

Updated: 28 JUL 09, 1845EST

I have long considered covering this extended and complex topic with my readers, but it has been a challenging and time-consuming task, especially considering how much reading and fact-checking is required; as such, it has taken longer than I originally anticipated.

Nonetheless, I will attempt to do this subject matter justice, not just because of the high-degree of misinformation generally surrounding such studies (often by design of the publishers) but also because their objectives are routinely politically and/or economically motivated.


One Huge Benefit of Intelligent Transportation Systems (i.e., Photo Enforcement Programs) is Money from Your Pocket into Theirs.

My goal is to truthfully answer questions about the supposed safety and economic benefits raised by certain government-sanctioned reports on Automated Photo Enforcement Systems (mislabeled as "Intelligent Transportation Systems").

I am going to start this series with a look at one particular "feasibility study" undertaken by Redflex, the Arizona Department of Transportation, the City of Scottsdale, and the Department of Civil & Environmental Engineering of Arizona State University, with Simon Washington, Ph.D. (and two of his colleagues, Kangwon Shin and Ida Van Shalkwyk).

There are two reports connected with this study (formally called the SEP, or Speed Enforcement Program). The first is a preliminary report, which later became the basis of the second and final report.

For the purposes of this post, I will focus on the initial report, entitled "Evaluation of the City of Scottsdale Loop 101 Photo Enforcement Demonstration Program: Draft Summary Report" and originally dated January 11, 2007, a report that can no longer be found online. Fortunately, we kept a copy as it originally appeared.

To appreciate the context of this preliminary/summary report and the brief analysis that follows, you should know that Scottsdale, Arizona has long served as a photo-enforcement-friendly city (as has the entire state of Arizona). For more than a decade, the state has served as host to two of the largest private, for-profit photo-enforcement companies: American Traffic Solutions (ATS) and the foreign-owned, Australia-based Redflex.

Both have a rich and long history of lobbying government legislatures while at the same time forecasting to their investors explosive revenue (i.e., sales) growth from both red light camera and speed camera enforcement.

It's also important to understand that for the purposes of ADOT's (Arizona Department of Transportation) and Scottsdale's photo enforcement exercise (the SEP evaluation), much of the underlying data used to drive the reports' conclusions was provided by Redflex, the vendor financially benefiting from the program study.

For the sake of the report's integrity, the data should have been independently measured and tabulated by those skilled in the art who have little or no financial stake in the outcome.

This fundamental lapse of integrity lies at the core of the SEP, and it is why the subsequent reports' findings, as well as the diluted summaries that followed (used to promote such systems elsewhere), proffered results and conclusions that were fatally flawed.

I'll begin discussing/dissecting the accuracy of the preliminary report by first examining its initial disclaimer (emphasis is mine, as is the text in yellow):

The contents of the report reflect the views of the authors, who are responsible for the facts and the accuracy of the data presented herein. (This is a false claim, for the underlying data was collected and presented by Redflex itself.) The contents do not necessarily reflect the official views or policies of the Arizona Department of Transportation or the Federal Highway Administration. This report does not constitute a standard, specification, or regulation. (Indeed, the "conclusions" of this preliminary report did, in fact, precipitate state-wide legislative and regulatory supporting efforts, not just those of the City of Scottsdale.) Trade or manufacturers' names which may appear herein are cited only because they are considered essential to the objectives of the report. (The objectives of Redflex and ATS were, and continue to be, the successful lobbying of these very same legislative and regulatory bodies for the support of their companies' products and services, with huge monetary inducements [a.k.a. glorified kickbacks] in the form of what amounts to an unconstitutional tax on motorists.) The U.S. Government and the State of Arizona do not endorse products or manufacturers. (This claim is entirely false, as the data provided and the objectives of the report(s) are to increase the use of Redflex's and/or American Traffic Solutions' products and services.)

Just in the preliminary report's initial disclaimer, every statement was factually inaccurate.

Now compare this disclaimer to Redflex's own "statements" in their financial report from the same period.

Redflex outlines a strategic plan to influence the legislative process.

Is it a coincidence that the city which hosts Redflex's U.S. corporate headquarters undertook such an SEP evaluation study at the very time Redflex was actively lobbying and selling its products and services? Answer: absolutely not.

Note that the top two priorities are directly tied to declining operating budgets and increased deficits (i.e., tax-revenue shortfalls), showing how these studies are largely economically and politically driven (i.e., there's the kickback to the politicians and the definition of their growing demographic: the U.S. taxpayer).

Unfortunately, the report's distortions/inaccuracies didn't stop there.

In future posts, I will continue to dissect more of this preliminary report and the related one that followed, as well as the derivatives that continue to float around various government agencies and industry trade groups. When we're finished, you'll have a better sense of how safe these photo enforcement systems really are and what the underlying motivations behind their deployment really are, not only in Scottsdale but elsewhere as well.

©2009, all rights reserved. no portion of this article may be reproduced without expressed consent of the author.