Composting Diversion

I haven’t written about my Green Cone in a while.

You want to know why? Because like most things that simply work, it is almost completely unremarkable.

I started blogging here a couple of years ago, not too long before I first got a Green Cone. I rambled on after that about installing it, initial concerns, and finally finding the groove.

I only bring it up again because last week the annual Solid Waste Report for the City was presented to Council, and according to the numbers for 2011, diversion rates are up. Way up. It occurred to me that I contributed 0% to that increased diversion. In fact, I might have been responsible for a small reduction of the overall number.

You see, diversion is the measure of how much curbside trash goes to recycling as opposed to the landfill (or incinerator, more on this later). The City’s diversion rate has gone from 31% to 59% between 2009 and 2011, almost completely because much of what used to go to landfill now goes into the Green bins, and is “diverted” from landfilling. This is a remarkable number, 59%, proof that a lot of people are using the green bins. The best part is, although we are currently not seeing the savings (we are managing the costs related to the shift in systems), we are going to save a lot of tax dollars in the long run.
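For the numerically inclined, the arithmetic behind a diversion rate can be sketched in a few lines of Python. The tonnage splits below are invented purely for illustration; only the 31% and 59% rates come from the City’s report.

```python
# Diversion rate: waste kept out of the landfill (recycling + green bin)
# as a fraction of total curbside collection. Tonnages are hypothetical;
# only the 31% and 59% rates are from the City's report.

def diversion_rate(recycled_t, organics_t, landfilled_t):
    """Fraction of total curbside waste diverted from the landfill."""
    diverted = recycled_t + organics_t
    return diverted / (diverted + landfilled_t)

# A hypothetical 2009-style split: no green bin organics yet
before = diversion_rate(recycled_t=31.0, organics_t=0.0, landfilled_t=69.0)

# A hypothetical 2011-style split: green bins pull organics out of the trash
after = diversion_rate(recycled_t=31.0, organics_t=28.0, landfilled_t=41.0)

print(f"before: {before:.0%}, after: {after:.0%}")  # before: 31%, after: 59%
```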

That said, I really don’t use my Green Bin. Since before they arrived, I have been putting my vegetative compostables into a compost bin and using it for fertilizer on my garden, and have taken the nastier organics in my trash (essentially, anything that stinks) and put it in my green cone. Add this to blue box recycling, and we really don’t throw much out. We take our 120L trash bin to the curb less than once a month (really, it’s all about the laziness… I hate getting up to take out trash in the morning). In the green bin? A few twigs from annual pruning, and a few gardening weeds I don’t want in my compost.

However, much of that actual “diversion” I do at home does not show up in the City’s or Metro Vancouver’s statistics, because none of that waste ever gets to the curb. Actually, since getting a Green Cone, our measured diversion rates as a percentage of our trash has likely gone down somewhat. So we are saving even more tax dollars.
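To see why home composting can actually lower the measured rate, consider a toy example (the kilogram figures are made up): waste composted at home never reaches the curb, so it drops out of both the top and the bottom of the fraction.

```python
# Why home composting lowers the *measured* curbside diversion rate:
# home-composted waste never reaches the curb, so it vanishes from
# both the numerator and the denominator. All figures are invented.

def curbside_diversion(diverted_kg, landfilled_kg):
    """Diversion rate as the City would measure it, from curbside waste only."""
    return diverted_kg / (diverted_kg + landfilled_kg)

# Household A sends everything to the curb: 60 kg diverted, 40 kg trash
a = curbside_diversion(60, 40)
print(f"{a:.0%}")  # 60%

# Household B composts 30 of those 60 kg at home, and curbs the rest
b = curbside_diversion(30, 40)
print(f"{b:.0%}")  # 43% -- a lower measured rate, despite less actual garbage
```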

As for the new blue bins for “commingled recycling”, I have railed on about these in the past. The thing is freaking huge, and I just can’t get over the idea that they are somehow going to make people recycle more because they now don’t have a separate bag for paper. We are going to have to wait until next year’s report to see if this proves true. I am hoping that next year’s report will also include some discussion of where the commingled recycling material is going: how much of it is recycled, how much is “bypass”, and how much is incinerated.

If it is going the way I suspect it is, and the way it has commonly gone in other jurisdictions where commingling has replaced separated containers and paper, then the best strategy for those who actually want to reduce their solid waste generation will be to take paper to the recycling yard, where we can be more confident it is being recycled and not burned.

Oh, the Green Cone? It works great. I have a little counter-top bin to collect things that I don’t want to go in the compost (bones, meat, fish, cheese, gravy, etc) and every couple of days I dump them in the cone. I have not had to add any of the bacterial starter for more than a year, everything down there looks appropriately grey and fuzzy, and there is no noticeable smell. The cone seems to be at some sort of equilibrium where I keep adding stuff and the pile stays the same size… Must have something to do with Dark Matter or other mysterious forces… The worst thing that happened this year is my little counter-top bin started to smell a bit. Turns out something funky got into the fabric fly filter in the top. I popped it out, soaked it a few hours in baking soda, then threw it in the dishwasher. Good as new.

Trolling a Bridge

OK, I’m back. EMAofBC Workshop event went well. Thanks for asking!

Back to regular de-programming…

Hector Bremner has been all over the local media recently. I have met him briefly a few times, and he seems like a nice guy. We disagree on a few things (most notably, our differing opinions on the leadership qualities of Premier McSparkles™), but ultimately, I think his heart is in the right place.

I did like reading his recent comments in the NewsLeader regarding the Pattullo Bridge proposals that have been snuck out to a few “stakeholders” for comment before the great unwashed get to opine.

It sounds like the 6-lane option for the bridge has been favoured, without real justification for expanding capacity at this time, nor any explanation of how a 50% increase in traffic capacity at one point in our City is going to help a City already congested with too much through-traffic. Setting aside two lanes for trucks sounds useful for “goods movement”, but in reality, you are just removing the trucks from the other two lanes, and making more room for cars. Then the increased number of trucks and cars are going to have to jam together again at the now-worse choke points on the north side of the river.

Make no mistake: I do support dedicating one lane each way on the bridge to goods movement, but not an additional lane. To me, goods movement is a bigger priority than commuters who have other choices. Perhaps a creative compromise solution would be to build a 4-lane bridge, and dedicate two lanes to trucks only during off-peak hours. This creates an incentive for trucks to use the roads when they are underutilized, allowing more efficient trucking, or they can choose to mix it up with commuters at peaks. Admittedly, I just pulled this idea out of my ashtray… there might be unforeseen consequences…

It is implied in the NewsLeader article that a 6-lane bridge will be “safer”, and here is where we see the creative messaging starting to appear. The danger on the Pattullo has always been excessive speed, usually late at night when the bridge was underutilized. I’m not sure how turning it into a wide, mostly empty, 6-lane speedway after midnight will improve safety. The greatest thing ever done to improve safety on the Pattullo was to reduce the number of lanes at night. Somehow, going the opposite way will now help? A 4-lane Pattullo can be as safe as a 6-lane Pattullo, and no Pattullo may be the safest option of all. So much for the safety argument.

The second part of Hector’s discussions I don’t quite follow. I agree that we should not be making major investments into infrastructure until we have a longer-term plan based on identified goals; however, he (or the editing) makes it sound like those sorts of plans don’t already exist, when clearly they do.

The broader regional transportation plan is called Transport 2040. The City has an existing Master Transportation Plan, upon which the next one will be built. There is also a Regional Growth Strategy, and his own favorite BC Liberal Government has a Climate Action Plan. All of these say the same thing: we need to stop building space for cars, because that will not solve any of the problems we are trying to solve, and start making it easier to take the alternatives: public transit or active transportation. The plans are there, created locally and regionally, with political blessing and public input, all laid out on paper. All we have to do now is build the infrastructure we were planning to build, not the infrastructure that is momentarily convenient to build. Because no-one likes to see infrastructure money wasted on making a problem worse (cough*cough*Queensborough bridge*cough).

I definitely don’t agree with the criticism that is implied about the Public Consultations that TransLink ran for the UBE. I think (after the false start forever known as the Donnybrook Conference) they did an excellent job of making themselves available to the public, of balancing the talking and the listening. I think they gave a solid effort to make something work that addressed the local concerns. When they could not come up with that solution through consultation, they “put their pencils down”, which was the honest thing to do. It would have been easy for them, with the political pressure on them from several fronts, to railroad that overpass through (pun intended), but they didn’t. They were true to their consultation model, and TransLink deserves praise for that, not criticism.

Anyway, it was good to see Hector (who is currently the closest thing we have to a BC Liberal Insider in New Westminster, yet he seems enigmatically logical) critical of how this road expansion will impact New Westminster. What I haven’t heard from him, or anyone else, is how our current traffic system will be negatively affected when tolls begin on the Port Mann and the existing Pattullo becomes the preferred route from Central Surrey. The party he wants to represent in New Westminster has made it pretty clear that they will not tolerate tolls on this bridge, existing or replacement.

That raises another unaddressed question: where is this $1 Billion going to come from? If not tolls, then where? TransLink is nearly destitute, and even the Pattullo Expansion’s biggest fan, the Mayor of Surrey, is clamouring for whatever spare cash they have lying around to be spent improving public transit in her under-serviced City. The Fed largesse is clearly running out. That pretty much leaves the BC Government, who at this point is happy to muck about unaccountably in TransLink finances, but wash their hands of the negative economics of that mucking about.

If this bridge is in immediate peril of collapse, then let’s plan for the most affordable, practical replacement that fits our needs: a 4-lane structure using the same major intersection architecture at each end. The net impact on the neighbouring communities will be minimized, and we can save $300 Million or so just by building a lean, mean 4-lane bridge. Let’s re-invest those savings in the modern, practical and efficient infrastructure that South-of-the-Fraser needs to reduce their dependence on the Pattullo and reduce the traffic load on New Westminster; just like Transport 2040, the existing New Westminster Master Transportation Plan, the Regional Growth Strategy, and the other plans we have been making for 20 years say we should.

Alternately, let’s actually get ahead of those plans and reach faster for those goals they outline: take the entire $1 Billion and build sustainable infrastructure South-of-the-Fraser, and begin the orderly phasing out of the Pattullo.

Where am I?

Once again, lousy excuses for not posting more.

I have opinions galore, on New West Council taking a hold-on-just-a-minute position on the Pattullo Bridge (good), on the NWEP chiming in about the Pattullo (excellent), on the Peter Kent’s continued asshattery (no longer shocking), on how some New West Rabble were perilously close to starting a grassroots “Friends of Jen Arbo” campaign, just for the fun of it  (and might just yet)… and other things.

But I am up to my eyeballs with this event, which I am helping organize along with being one of the speakers. I am pretty excited to be hearing Paul Anderson’s talk. There has been so much said about Northern Gateway, it will be interesting to hear about the science of the Environmental Assessment.

So everything else is on hold. Talk to you after the 16th.

Meanwhile, for your entertainment, here a buddy and I are checking out Mayor Rob Ford’s new Public Transit System proposal for Toronto:

Roomy.

MTP Open House 1.20

If you missed the first open house for the Master Transportation Plan, or would like to arm yourself before the St. Valentines Day Messaging, the presentation materials are available on-line at the City Web Site, which provides me the opportunity to go on about some of the details therein. I am completely copying these graphics here without permission from the City, so I suggest only New Westminster taxpayers, who paid for them, should click on them to see full-size versions.

The fifth slide (above) provides the first really interesting data. The bar chart shows Metro Vancouver municipalities ranked by “mode share”. That is the proportion of “trips” taken by modes of transportation other than driving a car. It is pretty clear from this bar chart that New Westminster is already a regional leader in sustainable transportation, with a mode share of 36%, second only to those dirty bike-lane hugging hippies in the City of Vancouver.

What I take out of that chart, though, is a demonstration that infrastructure matters. The top three Cities are those with the complete integration of the SkyTrain. The next two are also well serviced with transit, with the SeaBus an important part of their infrastructure. All of the top 5 have put serious investment into bike lanes and pedestrian amenities. They have all put an emphasis on building compact transit and pedestrian friendly town centers.

Now scan down to the bottom of that list, and see lowly old Langley City. Aside from its abysmal transportation infrastructure and complete commitment to automobiles, Langley City has no excuse for this. It is compact with relatively high population density (2,500 per sq. km., compared to New Westminster’s 3,700 and Burnaby’s 2,200, which are all way higher than North Vancouver District or Langley Township, which are both under 500).  It is relatively flat with a good balance of industrial, commercial and residential land. This 9% is no surprise, however, to anyone who has tried to traverse Langley City. No cycling facilities at all, disconnected and disregarded sidewalks, a half mile of parking lots separating every business. About the only infrastructure it has is a Bypass Highway and the Big Box Hell that is 200th Street. As a cycling buddy of mine once said, reading a road sign: “Langley Bypass. I don’t care where you are going, that sounds like the right idea.”

What’s worse: the Mayor of Langley City, Peter Fassbender, is the vice-chair of TransLink’s Mayors Council. I’m not sure if the 9% should be more embarrassing for him, or if TransLink should be more embarrassed that he has been elevated to help lead the region in Transportation when his own City serves as a cautionary example for the whole region… but as usual, I digress.

The walking numbers are not surprising. New Westminster is a great City for walking, and the hills are less of a barrier than some people (such as the Downtown BIA) suggest, as long as the sidewalk infrastructure is there to make the walking environment pleasant. The map shows that Connaught Heights and Queensborough lag behind as far as walking infrastructure goes. The “missing link” between Port Royal and the Quayside really stands out though…

The bicycling infrastructure info is actually a little disappointing. The cycling mode numbers are lower than I would expect, and the cycling network is distressingly incomplete, 13 years after the cycling plan for the City was laid out. Even the designated bike routes we have are either suffering from erosion from disregard (BC Parkway anyone?), or are bike routes in name only, being just as unfriendly for cyclists and completely unmarked as they were 20 years ago (try to cross 20th street on the London Bikeway, I dare you).

I love the topography map they have on the 10th slide. Whoever thought of that map must be a genius. It shows that most of New Westminster is not really that hilly. There are only a few areas where the slopes cause a real challenge to recreational or casual cyclists – the Downtown-Uptown link, the Brow of the Hill, and Sapperton. I hope with this Master Transportation Plan, we can re-draw the cycling map with this in mind, and find creative routes to make these slopes manageable for more cyclists. The City’s Bicycle and Pedestrian Advisory Committee already identified a route from New Westminster Station to the Saint Mary’s site that connects relatively gentle slopes with very short stretches of higher grades, with only a few missing links that need to be connected. The advantage of the current low mode share is that it should be easy to increase in the foreseeable future.

Transit is a good news story for New Westminster, and a large reason why we have such impressive mode share numbers. As I said before: infrastructure matters. Five SkyTrain stations for 60,000 people is one of the remarkable advantages of being a compact City in the centre of Metro Vancouver.

The problem comes down to frequency. The diagram above shows AM peak time frequency, which is pretty good. However, the City’s largest employers are not 9-to-5 mom-and-pop operations, but have a lot of shift workers. We also have a large lower-income population (due in part to our large rental property stock) who tend to also work in less traditional jobs. The reality for many people in New Westminster is that much travel for work is during off-peak times, not to mention travel for play. When transit service drops to 30-minute, or 60-minute service, it becomes completely unreliable. Especially in a place like Queensborough, where 20-minute bus service for a 15-minute ride to 22nd Street means that it can take more than half an hour at the best of times just to get to the SkyTrain Station you can see over there on the Hill. Once we hit 11:00pm, much of the transit frequency drops to zero.
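The back-of-the-envelope math on that Queensborough trip, assuming you show up at the stop without checking a schedule, goes something like this:

```python
# Rough door-to-SkyTrain time for the Queensborough example:
# a 20-minute headway plus a 15-minute ride to 22nd Street Station.

headway_min = 20  # minutes between buses
ride_min = 15     # minutes on the bus

# Average case: with random arrivals, you wait half a headway on average
print(ride_min + headway_min / 2)  # 25.0 minutes

# Worst case: you just missed the bus
print(ride_min + headway_min)      # 35 minutes -- over half an hour
```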

Of course, this is slightly out of the scope of the City’s jurisdiction. There is little the City can do but lobby TransLink to improve service. Unfortunately, we have been going the other way of late. What the City can do, however, is make sure the infrastructure on the ground is there to make using transit easier. For example, the City, as a general rule, lacks bus shelters. There are also several decisions we can make to prioritize cycling and pedestrian infrastructure improvements at vital transit links. Finally, we can (outside of the Master Transportation Plan, mind you), continue to encourage dense land use at our major transit nodes, like we have done at New Westminster Station and Sapperton Station, and are looking at for Braid Station. How long can 22nd Station be surrounded by single-family homes?

Slide 16 was interesting for a few reasons. The numbers that really stood out for me were 4,300 vehicles per hour on the 4-lane Pattullo Bridge compared to 3,500 on the 4-lane Queensborough. Who would have thought the lil’ old Queensborough carried more than 80% of the traffic of the big bad Pattullo? But I also don’t understand how we can have 3,000 vehicles on both sides of the bridge, and 3,500 over it… there is something funny about where these numbers were collected and what they mean for that intersection. I suspect this also argues pretty strongly towards saving a ton of money and replacing the Pattullo with a 4-lane structure, but another post, another time.
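A quick sanity check on those Slide 16 numbers (taking both volumes as reported, and assuming each count is a total across all four lanes):

```python
# Does the 4-lane Queensborough really carry more than 80% of the
# 4-lane Pattullo's volume? Both figures are from Slide 16.

pattullo_vph = 4300       # vehicles per hour, Pattullo Bridge
queensborough_vph = 3500  # vehicles per hour, Queensborough Bridge

ratio = queensborough_vph / pattullo_vph
print(f"{ratio:.0%}")  # 81%

# Per-lane loads, assuming the counts are totals across all four lanes
print(pattullo_vph // 4)       # 1075 vehicles per lane per hour
print(queensborough_vph // 4)  # 875 vehicles per lane per hour
```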

It was also interesting to see how the intersections we all love to hate performed when measured by actual delay relative to their design traffic volume. The stand-out is that the intersections on 20th perform fine. This is a surprise to those backed up on 20th, or the people of Connaught Heights and the West End who can’t get out of their neighborhoods due to the gridlock on 20th. This also reflects a recent traffic study on 20th done for the City by another consultant. Long and short: the problem is the Queensborough Bridge. There is nothing we can do to improve 20th Street except re-work the Queensborough Bridge interchange, which is Provincial jurisdiction and has already had a large pile of cash converted to concrete in recent years. I think the best approach here is to re-design the traffic system on 20th to make it most useable by Connaught Heights and West End residents, and let the through-traffic back up into Burnaby. It may be as simple as redesigning the traffic light signals…

The goods movement info here is the one that has so far generated the most discussion. The City’s survey has reflected what TransLink has said all along: about 40% of New Westminster truck traffic either starts or stops its trip in New Westminster. A few people (including Councillor Chuck Puchmayr at last week’s Committee of the Whole meeting) have questioned the methodology that resulted in this number. I am more interested in where in the City these stops are happening – are they all going to Kruger? Are they driving across the Pattullo and through downtown on the way to Queensborough Landing? Just how many semi-trailers full of food does Safeway need in a week? I really don’t know. But that kind of data might be useful.

The other part of the goods movement discussion is already popping up in the local media, and that is the future of truck traffic on Royal Ave. I think the issue of directing all commercial traffic to Royal is a big one, especially as we are going to have another school built right on it. There is strong pressure right now to make Royal the truck route, which will effectively increase total truck flow through the City. I think that is a bad idea, for various reasons. But again, this is already too long a post, so I will address that another time.

Discussing the Parkade – Part 1

There was a meeting tonight hosted by the Downtown New Westminster Business Improvement Association, on the topic of… well… a few things.

The main point of discussion was the future of the Front Street Parkade. There have been reports to council talking about longer-term visions for the waterfront, most of which include the partial or complete removal of the Parkade. These are supported by some engineering reports that indicate the maintenance costs for the Parkade are likely to go up significantly in the next few years, and some decisions are going to need to be made about how much to invest. It probably doesn’t help that some uppity bloggers have been calling for the end of the Parkade for a while now…

Naturally, there is a significant number of downtown business owners who see ample, inexpensive parking as fundamental to their business success. When others (be they uppity bloggers, City Staff, or Elected types) start talking about taking away their parking, they get a little itchy.

The BIA also had some gripes about back-in angle parking on Columbia, bike lanes, and Pay Parking on Sundays, but those issues seemed to be brushed aside as the conversation centered on the past, present and future of the Parkade.

Present were about a score of BIA members, and about the same number of non-BIA types (including downtown residents and uppity bloggers), a few members of the local media, and from the City: Mayor Wright, Councillors Puchmayr, Harper, and MacIntosh, and Jim Lowrie from the City Engineering Department.

The Transportation Committee of the BIA provided a PowerPoint presentation with a lot of words on it (all in caps lock), and perhaps I will go through that presentation in detail in a later blog post. For now, I want to talk more about the spirit of the room and the nature of the conflict, from an uppity blogger point of view.

If I’m not sure how to summarize what the BIA’s complaints are, it might be because they have not come up with a coherent message. It is clear they do not like that others are suggesting they are going to take away parking. They feel that access to parking, and more specifically, access to the entire Parkade that they (or their ancestral business owners) financed and built, is not only necessary for their businesses, but is vital to the future of the City’s business community.

Arguments that the Parkade is underutilized are countered by the BIA suggesting the Parkade is too expensive and not effectively marketed. Dan O’Hearn from the BIA went so far as to suggest that if the Parkade was put under the control of the Downtown Merchants, it would be filled to capacity providing revenue for the Merchants. This reflected a certain spirit in the room that the Parkade is just poorly managed by the City.

To me, this argument has always sounded like cognitive dissonance, to argue on one hand that every parking spot is needed and that Downtown suffers from a lack of parking, then to argue on the other that the Parkade needs to be more effectively marketed to get people to use it. Are they saying there isn’t a lack of parking Downtown, there is a perceived lack of parking? Or are they saying there are simply not enough parkers downtown? How is either an argument for investing in a mostly-empty Parkade?

Even then, whose responsibility is it to advertise the availability of parking in the Downtown Parkade? The City? The Parking Commission? Dare I say, the BIA?

However, I think the main complaint I heard was that the BIA was not in the loop about what is going on. I heard a lot of people unhappy about not being consulted, and more than a few people worried that the Parkade would be going away this year, with no plan to accommodate the people who currently use the Parkade. It may only be 30% used on most days, but that still represents more than 200 parking spots.

I think this is where there is agreement between the BIA and uppity bloggers like me (and other people who are looking forward to a pedestrian-friendly Front Street connecting Downtown with real human ties to the Waterfront). We agree that there needs to be a plan. I just happen to think we need to look at confirming our current and future parking needs, then planning to accommodate those needs as we develop the Downtown, with the eventual goal being the removal of the eyesore Parkade from our waterfront. The BIA just doesn’t want the Parkade removed until there is a plan in place to accommodate Downtown parking needs. Some might think we are looking for the same thing.

Jim Lowrie and the Councillors at the meeting said as much. The plans they have read regarding eventual Parkade removal have been mid- to long-term planning documents. The City has no intention of removing the Parkade until the BIA and other stakeholders have been consulted, and until there is a comprehensive plan to address the current and future parking needs of Downtown.

So the two sides are not too far apart, and the City is right in the middle. This shouldn’t be too hard, should it?

The Port Declares War

Jeff Nagel of Black Press (who is turning out to be the best Municipal Affairs reporter in the local Dead Tree Press) wrote a piece on recent proclamations by the new CEO of Port Metro Vancouver, and the reactions from various groups throughout the lower mainland.

My first reaction was – poor bastard from England has no idea what he is doing wading into ALR politics. Then I did a little research and see that CEO Robin Sylvester was party to the sell-off of part of BC Rail, so he is obviously aware of (and not afraid of) the worst of BC political morass. He knows exactly what he is wading into here, the poor bastard.

My issue with the Port Authority is not just their stunning disregard for the spirit of the Agricultural Land Reserve (even if, as a Federal Agency, it doesn’t apply to them), but their business model. It isn’t just farmland that the Port has declared war upon, it is our roads, our waterfronts, and the livability of our cities.

All of this discussion skips over the reason we have an ALR. It is because BC has very little high-quality farmable land, and most of it is very close to Vancouver. Once farm land becomes industrial land, commercial land, or a neighbourhood, it is nigh impossible for it ever to be returned to agricultural use. None of these characteristics are true for Industrial Land. Industrial Land can be located anywhere, and land that was once industrial can be easily converted to other uses – and land under other use can easily be converted to industrial use. All it takes is for someone to spend the necessary money to convert the land. So the need for an “Industrial Land Reserve” is a red herring. There is no scarcity of land to put warehouses upon, although there is currently a scarcity of people willing to spend money to develop industrial land, and a lack of willingness for Cities to provide appropriate industrial zoning within their land base.

Which brings us to the Port, an organization that is exploiting these issues, and is rapidly getting out of the business of taking things off of and putting things on to boats. If farmland (which is commonly located right next to the River) is sacrificed for that, they may have an argument for balancing out industry and farming. Frankly, if the current buzz-word “Food Security” is our primary concern, it is no worse than Golf Courses or cranberry bogs, or even the 100-acre greenhouses being built on ALR farmland today:

140 Acres of our best Reserved Agricultural Land in Delta

Except that the port isn’t using our prime farm land to take things on and off of boats. They are using it to take things on and off of trucks, something they can do on any land, really. No need to use ALR land. The only reason they choose to do this on ALR land is because they can buy ALR land at a fraction of the cost of non-ALR land. Since they are able to remove it from the ALR with federal fiat, they can convert it to valuable lease space for warehouses, instead of buying expensive commercial- or industrial-zoned land that municipalities have set aside for just that purpose.

This is because the Port is no longer in the business of taking things on and off of boats, they are now a real estate development and lease business. How else can one justify the purchase of more farm land in Richmond? Look at the port land adjacent to their recent purchase in Richmond:

Click to zoom it, or go to Google Earth yourself

All those warehouses (actually there are more now, this photo is a little old), a new highway overpass to connect this land to the East-West Connector, and only one thing is missing: Docks. There is a single berth there for ships, where a single business moves wood pulp onto barges from the rails. Every other business there is truck-oriented, with only a couple even having rail spurs. This is the Port Authority business plan for ALR land. Buy cheap, develop, lease for cheaper than anyone else can. That’s the free market, I guess.

So what? Notice how much of the talk about the traffic issues in New Westminster is around “goods movement”. The issue always comes up of trucks crowding our roads, or our livability being eroded by the noise and pollution of all this container traffic on our roads. People wonder why we aren’t using the river or the rails more, and why there are all these trucks on the road. They aren’t bringing loaves of bread to Safeway; they are shuffling goods from the actual Port to “Port Facilities” like these, and to the vast warehouse ghettos of places like Port Mann, Port Kells, and Port Coquitlam – all locations of huge truck warehouses, and all lacking in actual Port facilities to move things on and off of boats (with the occasional exception of logs and woodchips).

How will we ever make use of the goods movement opportunities of the River, when it is against the business interests of the Port Authority – the only agency with any jurisdiction over the waterfront?

Master Transportation Plan Open House 1

Yesterday was the first Open house for the City’s new Master Transportation Plan process. Right off the bat, it looked like the turnout was great. I would put the over/under on total attendance at 90, if you include the staff and a few City Councilors (but, notably, not the Mayor). It was no donnybrook, but for a preliminary information session held on a busy night, it was good to see so many people are interested in the process. 

The Open house featured poster boards with some of the preliminary info collected by traffic counts and surveys, and a short presentation providing details on some of the posters, and giving a broader view of the process ahead. There were also some opportunities to add your comments to post-it boards, and to fill out a survey of pretty general questions. I have a few comments on a few interesting facts and ideas provided by the posters and presentation, but I’ll cover those in a later post. Here, I want to talk more about the feeling in the room. 

From listening to the conversations, most vocal concerns could be summarized into one of three broad categories: 

1) Through-traffic is a problem, but we can fix it once and for all by doing “x”; 

2) The intersection of “x” and “x” is the worst in the City! It needs to be fixed; and 

3) Why aren’t more tickets given out to bad drivers / cyclists / rat runners / anyone but me?

Of these, number 3 has the least to do with the Master Transportation Plan. It speaks somewhat to a poorly functioning transportation system if systemic lawbreaking is the normalized way to operate the infrastructure, but targeted enforcement is really a complex issue involving driver education, signage, the police, and the community. The Master Transportation Plan will hopefully result in a better-integrated system that reduces the bad behavior of users, but that is secondary to the task at hand. If traffic enforcement is really a passion of yours, why not join the City’s Neighbourhood Traffic Advisory Committee… they always need help!

Number 2 is sort of what this is about. The solutions found might pick out a few key intersections and areas for improvement of the transportation network, but the bigger ideas will come in answering questions about how we want our intersections and other infrastructure to work, and how the various bits of the infrastructure can work better together. 

Number 1 is a big part of this. However, I bet the problems are more complex than we think, and the solutions will not be simple ones. Unfortunately, some of the problems will not have a satisfying solution at all (Queensborough Bridge, anyone?), but that doesn’t mean this process is not useful, or that it cannot change the way we approach these problematic areas.

After the presentation, there was a bit of time for questions from the audience. Both the questions and answers below are paraphrased; any error of fact or language is mine, as I tried to catch the gist of the conversation, if not the detail. I have added my comments after each Q&A point.

Q: You say 40% of trucks are going to a destination within the City, but what about the rest of the traffic? It would be interesting to see how much of the car traffic passes right through.

A: No answer was offered, as it seemed like more of a statement than a question. 

This, more than any other point, is the big gripe New Westminster has about traffic, and the gripe our neighbouring communities have about us. I concur that it is important for us to get this number, because it seems to range depending on whom you ask: 60%? 80%? More? Since so much of the conversation in New West is about it, we should start from a factual base. The strange part of this discussion is that many people who think this is our #1 problem also think the solution to too much through-traffic is to blow the bank building infrastructure to accommodate more through-traffic (freeways through, around, or under the City).

Q: How does this align with the proposed Pattullo Bridge project?

A: The Pattullo Bridge project is the jurisdiction of TransLink, and will include its own public consultation process, likely starting as soon as February.

However, the data collected for this plan, the impacts of the Pattullo refit/replacement, and the impacts on New Westminster when the Port Mann II comes on-line with its tolls, will all need to be considered as part of the City’s planning. I didn’t get confirmation on this, but I assume TransLink will be one of the agencies identified as a key stakeholder in the entire MTP process.

Q: This City is right next to the River- is there any consideration to using the River for transportation?

A: We don’t know of any plans to move passengers on the river that have gone past the very-high-level concept phase, but there has been discussion of this in the past. Port Metro Vancouver will be one of the agencies invited to take part, and they have been invited to have a seat at the table here.

Goods movement on the River has been a pet peeve of mine for a while, but I will save my strong opinions about how Port Metro Vancouver is screwing the entire Metro Vancouver area for a later post.

Q: What is our clout, jurisdictionally? If TransLink and Province and our neighbouring Municipalities have different plans than us, what can we do about it?

A: Some roads in the City are Provincial, some are part of the Major Road Network and are TransLink’s, but most are owned by the City. We work with these other agencies, and the UBE experience taught us that a strong, united community can have an influence. Experience has shown that a City with a well-articulated Master Transportation Plan is in a better position to negotiate with other agencies to protect the goals of that plan.

This was a great answer, and speaks to the importance of us not only putting a good plan together, but also acting on it to demonstrate that our community supports the goals outlined in the plan. 

Q: What about the UBE, are we going to address that issue as part of this?

A: If the UBE is identified as an issue during this process, then we can look at potential solutions to that issue. However, TransLink has taken the UBE off the table, and are not planning to build it anymore. That project was a TransLink one, with some Federal money. 

The UBE is dead, and the North Fraser Perimeter Road is at least in a very, very deep coma; the chances of it coming back are not nil, but they are vanishingly small. But many of the problems highlighted in the UBE discussion (rail crossing safety, access to the Braid Industrial Area, the Braid and Brunette intersection) were left unaddressed once TransLink’s approach was found to be unacceptable. I think there are creative solutions to these issues, and I hope having TransLink, the railways, Port Metro Vancouver, the truckers, and Coquitlam at the table will help us find some common understanding on these issues, if not a solution.

Q: Are we working with the neighbouring communities, and have Urban Systems tracked the success rate of their previous clients for these types of Plans?

A: First question: yes, neighbouring Municipalities will be involved in Agency Workshops. Second question: yes. In their experience, most clients have implemented 50-70% of their plans 10 years after the plan is finalized. An interesting nuance is that the projects completed are sometimes not those that best suit the goals set forth in the plan.

That second part might need some clarity. I can think of an example where a City with the goal of “Improving Pedestrian Safety” might get a big grant to build a connector road in an underserviced area, but defer the sidewalk improvements to a later date to take advantage of a short-term funding opportunity. Or someone like Rob Ford gets elected and decides to tear up an integrated cycling network and replace streetcars with subways, resulting in increased car-dedicated road space. Even the best-laid plans sometimes get nuked by bad politics.

If you missed the open house, there will be another opportunity on the afternoon of Valentine’s Day at Century House. Nothing says “I love you, Honey” like skipping off work to take your date to a community open house on transportation policy planning.

The MTP Begins

Thursday night is the first open house for New Westminster’s Master Transportation Plan. The first meeting will mostly talk about the process to come over the next 12-18 months, and there will be more public consultation, so don’t go in expecting to hear a lot of answers… but do expect to hear lots of questions, and be prepared to ask them!

The part I am looking forward to is the first bits of data coming from the City’s traffic measurements and public surveys. It will be interesting to see whether the problems we perceive are the same as the problems shown by traffic counts and other data collected by the City and their consultants.

As for the path ahead, the new President of the NWEP, Reena Meijer-Drees, does a great job getting Grant Granger at the NewsLeader up to date on the vision that group has for the future of transportation in the City. This is a great start.

There was also a great short article in the March Walrus Magazine (I suspect you non-subscribers will have to wait a month or so until you can read it on-line, or pick it up at the Library) about Luc Ferrandez, the Mayor of Montreal’s Plateau borough. Being both a cyclist and a believer in contemporary urbanism, he has been turning one of the most storied and historic neighbourhoods into a pedestrian-friendly paradise of wide sidewalks and green spaces.

Limited in his powers by a Metropolitan Government that oversees all major transportation infrastructure, and facing opposition from neighboring communities whose denizens want to commute through the Plateau unfettered by his neighbourhood traffic calming, Ferrandez is unapologetic. How unapologetic?

“I accept that some people think I’m the Devil. For them, the Plateau doesn’t exist. It is just a place to be driven through. I don’t give a shit about those people. They’ve abandoned the idea that humans can live together”.

Oh, to have the candor of Québécois politicians. However, when speaking about his vision for his neighbourhood, he sounds inspired:

“The Plateau is an Italian cathedral. It’s a forest. It’s something to protect, something sacred. I don’t want it to become a place where people come to live in a condo behind triple-glazed windows for a couple of years. This has to be a place where people can be comfortable walking to the bakery, walking to school, walking to the park – where they want to stay and raise a family”.

Will anyone stand up and say they want anything less for British Columbia’s most historic City?

Confessions of a Greenpeace Dropout Review – Part 6 – Fit the last.

As diligent readers are aware (Hi Mom!), I have been ploughing my way through Dr. Patrick Moore’s dissertation on “Sensible Environmentalism”. What started as a review turned into a lengthy criticism. This is the last fit of a 6-part essay, and it is worth reading it all if you want to learn how Patrick Moore and his greenwashing company use misinformation, self-contradiction, and frankly absurd ideas to market everything from coal mining to salmon farming as “Green Industries”. You can follow these links:

Although this book is full of ideas with which I disagree, and many ideas that are just flat wrong, I always suspected Dr. Moore at least came by his ideas honestly, or for the most pragmatic of reasons. His debatable ideas on clear-cut logging (the best thing one can do for a forest!) and fish farming (the only way we could possibly save the native salmon!) likely arise from his history working as a logger and a farmer of fish. His call to end government subsidies for wind and solar, while at the same time making the use of ground-source heat pumps mandatory, may have to do with his promotion to Vice President of NextEnergy: “the Canadian leader in designing and marketing geothermal systems for the home!”

Or maybe those are coincidences.

However, in his discussion of Anthropogenic Global Warming (AGW), Dr. Moore not only loses his remaining credibility, but loses any claim to being science-minded, skeptical, sensible, or an environmentalist. Coming from someone with the intelligence, training in science, and access to information that Dr. Moore is alleged to have, his arguments are so poorly thought-out, so anti-science, and so ill-informed, that it can only be the result of a disingenuous and callow disregard for the truth, and for the intelligence of his readers. I am going to waste a lot of words discussing this part of the book, because it is a microcosm of everything that is wrong with the current public discourse on AGW.

To get there, we have to first take a step back and talk about Duane Gish. Dr. Gish is a Young-Earth Creationist who met with some small fame holding public debates against scientists on the topic of Evolution. Dr. Gish brought to these his opinion that the Bible is literally true and that the Universe was created in a single 6-day fit about 8,000 years ago, in exactly the order that is written in Genesis. Clearly, this is a preposterous position to debate against a serious scientist with academic expertise in genetics, geology, astronomy, or, for that matter, physics or chemistry. That did not stop Dr. Gish. Paradoxically, audiences would quite often leave the debates feeling Dr. Gish had “won”. This is because he used a rhetorical technique that he wielded with such might and power that it now bears his name.

The Gish Gallop is a debating technique where one uses their allotted time to throw out such a large number of disconnected, unsupported, misrepresented or simply untrue “facts” that the opponent can only hope to refute one or two of them in their rebuttal time. After rebuttal, the Galloper ignores the countering points made by their learned opponent, and just throws out a new random pile of other points, or even the same ones slightly re-phrased, until the opponent is left to throw up their arms in frustration. It is less the shotgun technique than the M61 Vulcan technique.

The point is: for the Galloper, it is not important that you support any of your allegations with truth or data, or even if several of your allegations contradict one another – just keep shooting out stuff and let the poor bastard on the other side try to refute it all. To a general audience, one guy sounds like he has all the facts, the other guy can hardly refute any of them, so guess who wins? The Gish Gallop is well known by Creationist “debaters”, and has been adopted very successfully by people like Lord Monckton when discussing AGW. In skilled hands, it is an effective debating tool. It is also the mark of someone who knows that few of their actual arguments will stand to scrutiny on their own, so in that sense, it is the epitome of being disingenuous.

When I read Dr. Moore’s discussion of AGW, I couldn’t help but see Gish Gallop all over it. He, in turns, argues that it isn’t getting warmer, that warmer is better, that climate scientists lie, that scientists are incompetent, that most scientists don’t believe in AGW, that CO2 cannot cause warming, that the warming caused by CO2 is good for plants, that the ocean is not acidifying, that ocean acidification is good for corals, that human action can’t possibly impact the climate, that human activity might have prevented an ice age, that AGW will lead to more species, that sea level is not rising, that sea level rise is a good thing, that ice is not shrinking, that ice shrinking is a good thing…etc. etc. It is painful to read, mostly because it seems that Dr. Moore forgot that Gish Galloping does not work if those you are debating against have infinite time to refute each point one at a time.

Now I cannot hope to address each of his points here. Even given infinite time and near-infinite bandwidth, my patience for stupidity is not infinite, nor should yours be. So I am going to skim the cream off the top of his Gallop, and allow you to find out for yourself if there are any curds below.

Dr. Moore’s discussion of AGW starts by suggesting there is no scientific consensus on AGW. This argument can be summed up into three Logical Fallacies: Argument from Incredulity, Argument from Authority, and Argument from Popularity.

The first argument is basically this:

“The subject of climate change… is perhaps the most complex scientific issue we have ever attempted to resolve. Hundreds, possibly thousands of factors influence the earth’s climate, many in ways we do not fully understand” pg. 330

This is a rather uncompelling argument. I hardly think measuring the basic energy flows of the earth’s atmosphere is all that much more complex than, oh, I don’t know, tracking near-light-speed particles with half-lives measured in picoseconds at the Large Hadron Collider, or unravelling the 3 billion base pairs in the Human Genome Project. Yeah, complicated, but hardly insurmountable. With numerous lines of evidence from dozens of different disciplines pointing to the same conclusion, and a well-understood chain of causation, it is not really that big a scientific leap to conclude that increased CO2 output results in higher atmospheric CO2, which results in a stronger greenhouse effect.

Argument two sounds like this:

“A comprehensive scientific critique of the IPCC’s findings… was signed by more than 31,000 American scientists and concluded, ‘there is no convincing scientific evidence that human release of…greenhouse gasses is causing or will cause catastrophic heating of the Earth’s atmosphere’. Clearly there is no overwhelming consensus among scientists on the subject of climate.” Pg 332

The 31,000+ name petition of which he writes is none other than the one generated by the venerable climate research foundation the Oregon Institute of Science and Medicine. You need to follow that link to see what they are about, seriously, take a look. It is telling that Dr. Moore talks about their work, and provides lots of references to them in this chapter (more on that later), but he clearly recognized that linking to this source would not improve his credibility. This is what I mean by being disingenuous.

I know, that is a bit of an ad hominem (although ad hominem is actually a valid rebuttal to an Argument from Authority), so let’s take a closer look at the 31,000 scientists. You can see from the Petition Project site that, of the 31,000, exactly 39 self-declared as climate scientists. Compare that to the 2,000+ climate scientists who took part in the IPCC Working Group that the Petition Project was a response to. Sounds like something close to a consensus there. What of the other 30,961 scientists? A random mix of biologists, geologists, computer scientists, chemists, engineers, and medical doctors. Yes, more than 13,000 were trained in medicine or engineering (I know my podiatrist has strong feelings about climate change, but does his M.D. really represent authority on the subject?). The only selection criterion for the Petition Project is that you had to have at least a B.Sc. in some physical science field, or medicine, or engineering. To put that in perspective, there are, according to recent counts, at least 10 million Americans who have received a B.Sc. in an applicable discipline since 1970. So the 31,000 represent about 0.3% of “American Scientists” the way the petition itself defines them. I dunno, 99.7% sounds pretty close to a consensus to me.
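The back-of-envelope arithmetic above can be checked in a couple of lines. This is a minimal sketch using only the round figures quoted in the post (31,000 signatories, roughly 10 million eligible B.Sc. holders); both numbers are the post's estimates, not precise counts.

```python
# Rough share of "American Scientists" (as the petition defines them)
# who actually signed, using the post's own round numbers.
signatories = 31_000
eligible = 10_000_000  # rough estimate of qualifying B.Sc. holders since 1970

share = signatories / eligible * 100
print(f"{share:.1f}% signed; {100 - share:.1f}% did not")  # 0.3% signed; 99.7% did not
```

Even if the 10 million figure is off by a factor of two in either direction, the signatories remain well under one percent of the population the petition claims to speak for.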

As an aside, they seem to put a lot of emphasis on the scientific credibility of TV weather forecasters. I rest my case.

Ultimately, the Petition Project is a marketing exercise, not a scientific survey. It was a voluntary on-line sign-up, with no vetting of actual credentials. Luckily, a scientific analysis has been done, judging the opinions of climate scientists, other scientists, and the general public. It seems the consensus decreases the less people actually know about the climate and about science. Likely the Dunning-Kruger Effect. Which brings us to Argument #3.

This third argument is a general discussion of how the general public doesn’t believe in AGW. He quotes a bunch of public opinion polls indicating the “man on the street” does not believe in AGW, or even that people don’t believe that other people believe in AGW, as if that were relevant to the scientific certainty of the issue:

“A poll taken by Ipsos Mori found 60 percent of Britons believed ‘many scientific experts still question if humans are contributing to climate change’. Clearly a majority of the British public does not believe there is a scientific certainty on the subject”. Pg 334.

Now, I hate to sound like a weedy academic elitist, but polling public opinion about the opinions of researchers is not really the best way to find scientific truth.

Do I really need to say that to a guy with a PhD?

Again, for perspective only, I can list things that a majority of Americans think, according to polls similar to the ones Dr. Moore cites, and you can decide if these are, therefore, scientific facts:
80% believe in the literal existence of angels;
78% believe Evolution by Natural Selection is false;
60% believe that Noah’s Flood actually happened.
So much for the wisdom of the majority.

Soon after this, Dr. Moore’s honesty takes another dive. There is a bit of intellectual dishonesty that people often engage in, on both sides of this discussion: “cherry picking” data. This is a type of scientific fraud where you pick data that supports your theory, but disregard data that does not, without any justification for that dismissal. Aware of this concern, Dr. Moore says:

“I will try not to ‘trick’ the reader by cherry-picking timelines that support a particular bias” pg 336

Then, on the bottom of the very same page he engages in this blatant piece of cherry picking:

“Since 1998 there has been no further warming and apparently a slight cooling” pg 336

On… the… very… same… page. He also engages in timeline cherry picking in other areas, such as on Page 344, alleging “cooling” between 1940 and 1980 (when there was actually a slight slowing of the continued warming trend), but let’s concentrate on the first cherry pick, as it is very commonly heard in the Anti-AGW noise.

The grain of truth in that pile of bullshit is that 1998 was previously thought to be the year with the highest average temperature ever recorded by surface-based instruments since reliable instrument records began around the turn of the previous century. It is more commonly held now that 2005 and 2010 were both warmer, with the benefit of more robust analysis. The argument about 1998 vs. 2005 vs. 2010 is kind of irrelevant, though, seeing as nine of the ten hottest years recorded have happened in the last ten years, with 1998 being the one outlier. Plain and simple: the world is getting hotter at a rate unprecedented in our recorded history, or in the proxy record (tree rings, varves, coral layers, ice cores, etc.). And surface temperature logs are not the only measured effect that demonstrates AGW.

The importance of rate of change is a topic Dr. Moore completely ignores. In 15,000 words on AGW, where he often mentions that the temperature has been warmer in the past (ironically putting trust in scientists who make assumptions about the earth’s temperature millions of years ago, but not trusting them when they suggest it is warming now… cognitive dissonance much?), he never mentions that the rate of temperature change is as important as, if not more important than, the actual amount of change.

This is strange, because Dr. Moore spends a bunch of time talking about how easy it will be for the planet’s species (including people) to react to climate change (after denying it exists). The scientific literature has been pretty clear in demonstrating that adaptation to natural epochal shifts in temperature is a normal part of the world’s ecosystems, but it is the century-scale shifts of multiple degrees that will cause most of the negative ecological effects of AGW. There is no way the boreal forests will have time to shift north if the planet’s temperature increases markedly over less than a century, to give a single example.

Dr. Moore even talks about how the planet was almost 3 degrees warmer 9,000 years ago during the Holocene Thermal Maximum (which he actually lies about, since the HTM was a regional temperature trend driven by the recession of the Laurentide Ice Sheet, not a global trend, and it was only about 1.6 degrees warmer in some areas than today), but fails to notice that 3 degrees over 9,000 years is a much different thing than 3 degrees over 100 years. I suspect he is being deliberately obtuse here, or he just hasn’t read the science.

Or maybe he figures the researchers who spend their lives studying historic climates don’t know about the HTM, just like he assumes NASA doesn’t know how to locate or read thermometers. This is the basic accusation he makes against NASA and NOAA. On page 337 he purports that the Urban Heat Island Effect is causing us to observe increasing temperatures due to local effects only (blithely assuming that the scientists at GISS and NASA, who I note are able to put a freaking temperature probe into orbit around Jupiter, haven’t thought about this little detail).

Then on page 345 he accuses NASA of deliberately removing the “colder” thermometers to lead to a false conclusion about current temperature trends (an accusation of scientific fraud that has no actual data to support it, and has nonetheless been proven false). He conveniently avoids mentioning the myriad other ways we measure the earth’s temperature aside from the surface thermometer record, such as ocean temperature, satellite observations, and dozens of proxy techniques.

With his scientific credibility tied to ecology, Dr. Moore should know more about plants than he is letting on. Perhaps this points to his lack of Masters research, and his apparent lack of academic publishing after his PhD (which was a study on mining policy and local tidal effects). So when he states that the measured increase in atmospheric CO2 is good for plants – and uses some ridiculous horticultural greenhouse studies to support his argument – it is hard to give him the benefit of the doubt and assume he simply doesn’t know better.

Dr. Moore (taking cues from other climate change deniers) takes the argument to its most ridiculous extreme on page 352, suggesting that if human society and the industrial revolution hadn’t come along to produce all of this CO2, then plants probably would have died out from lack of CO2 (wait, didn’t he, a few pages earlier, argue that most of the CO2 increase was natural? Yikes).

It is true that in a hydroponic greenhouse system, where there is an effectively unlimited supply of every other nutrient, CO2 (which is not plant “food”, but more plant “air”, to correct the allegory) may become a limiting factor in growth. In that case, adding more CO2 may hasten the growth of plants in that very specific, tightly controlled environment. Of course, this translates nada to the real world outside of greenhouses or basement pot farms. The reason for this, as Dr. Moore surely knows, was well understood in the 1800s, when Liebig developed his Law of the Minimum.

Like most biological ideas from the 1800s, this makes perfect sense even to the uneducated in the subject today. Plants require a suite of nutrients to grow: CO2, water, nitrogen, phosphorous, potassium, calcium, etc. Liebig demonstrated, using fertilizers, that growth is limited by one “limiting nutrient”. In nature, that nutrient is usually either water or nitrogen (or, more specifically, the ability of soil bacteria to fix nitrogen). The same goes for animals: if you are deprived of water and carbohydrates, no amount of oxygen in the world is going to keep you alive for very long. In reality, increasing atmospheric CO2 enough to dramatically raise atmospheric temperatures will have a negligible effect on plant growth rates, and if it did have an effect, it would likely dramatically increase demand for nitrogen in the soil – already the limiting factor for most commercial farming. Even this response is likely to be short-lived and have severe negative repercussions. Don’t take my word for it. And certainly don’t take Dr. Moore’s.

Idiotic is the word that comes to mind when Dr. Moore starts talking about sea ice. He ignores all of the data currently available (on the very website he cites!) that demonstrates that Arctic Sea Ice is continuing to decline in mass, not recovering from 2008 levels as he implies on page 359. He takes one graph from the Cryosphere Today, claiming it shows no reduction in sea ice, yet fails to cite this graph from the same page, or this one, or this one from Antarctica. He also falsely claims that

“Our knowledge of the extent of sea ice in the Arctic and Antarctic began in 1979, the first year satellites were used to photograph the polar regions on a continual basis” Pg 359

This is stunning ignorance. Sea ice was measured by mariners for hundreds of years prior to 1979, and even longer by the Inuit. There are also ice cores (which tell us the age of any single piece of sea ice), and dozens of analysis techniques that can be applied to arctic sediments, such as varving of sea-floor sediments around arctic deltas, palynology records, and arctic flora and fauna growth patterns, to trace back the history of ice at both poles. This is another Argument from Personal Astonishment. I don’t know if you noticed, but we know there was ice over Georgia Strait 15,000 years ago, even though we don’t have satellite photos to prove it!

One has to wonder about his ability to do basic journal research when reading his discussion of ocean acidification. On pages 361-362, after quoting a paper by Orr et al. that states “Between 1751 and 1994 surface ocean pH is estimated to have decreased from approximately 8.179 to 8.104 (a change of -0.075)”, Dr. Moore replies, writing:

“One has to wonder how the pH of the ocean was measured to an accuracy of three decimal places in 1751 when the concept of pH was not introduced until 1909”

Well, one does not have to wonder, because one actually cited the actual freaking scientific paper! All one has to do is read the paper one cited. If one does that, though, one finds that the paper cited by Dr. Moore contains no such quote! The quote seems to have been lifted from that esteemed scientific journal Wikipedia, as it appears in the introductory paragraph of the Wikipedia entry on “Ocean Acidification”, although with less precise numbers (which further erodes part of Dr. Moore’s original whinge, doesn’t it?).

Clearly, Dr. Moore didn’t even bother to read the papers he mis-quotes, nor did he bother to read the papers that Wikipedia cited as the source of the quote, because that paper from JGR explains that ocean-atmospheric gas exchange can be very accurately determined if you know the chemistry of the ocean and atmosphere, and a bit about temperatures (all of which can be currently measured from proxies, such as sediment cores, carbon and oxygen isotopes, and coral ring growth). Just because pH hadn’t been discovered, doesn’t mean it didn’t exist. Gravity existed before Newton, you nitwit.

Can we all agree that the days of citing Wikipedia in any discussion about anything other than Wikipedia are over? It is the internet equivalent of citing the Encyclopaedia Britannica while writing our grade 9 reports on Argentina – the teacher didn’t like it then, and they wouldn’t accept it now. But Dr. Moore cites Wikipedia no less than 12 times during his discussion of AGW.

This crappy citation rigour is, unfortunately, a trend continued in Dr. Moore’s brief Gish Gallop on pages 345-346 about how scientists used to predict a new ice age was coming, providing two excellent references: Spiked Online and something called ZombieBlog. I wonder if their scientists signed the petition.

Yet another argument from Dr. Moore’s personal incredulity is to question whether the increases in atmospheric CO2 are actually man-made, or just a natural trend; after all, CO2 has been higher in the past.

“Many scientists assume that human emissions of CO2 from burning fossil fuels are the main cause of this [observed] increase [in atmospheric CO2 since 1958]. Some scientists question this assumption.” Pg 336.

This is such an important point of contention, he raises the question rhetorically a few pages later:

”Is CO2, the main cause of global warming, either natural or human-caused?” pg 338

Except this is not an assumption made by scientists, nor is it a rhetorical question; it is an observable phenomenon. Atmospheric scientists can differentiate CO2 from natural and anthropogenic sources using carbon isotopes. It is pretty clear from isotope analysis that the observed increases in atmospheric CO2 during the 20th century are dominated by fossil fuel burning. If “some scientists question this assumption”, they need to come up with some data to support their point. They haven’t.

There are other examples of scientific illiteracy in this book, but at some point they come on so fast and so erratically that responding to them would be futile. Pure Gish Gallop gold. Dr. Moore’s profound lack of understanding of hydrology leads him to opine that glaciers don’t do anyone any good (pg 357). He suggests a warmer world is better because… wait for it… people like warm weather and can freeze to death when it isn’t warm enough (pg 340). Since wetlands are so good for migratory birds, what’s the problem with rising sea levels (pg 366)? After a while, throwing this terrible book against the wall was giving me repetitive strain injuries.

Speaking of repetitive strain, Dr. Moore also jumps into “Climategate”. The book first makes a passing reference to this alleged scandal early in his discussion of AGW:

“in November 2009…thousands of emails, leaked or hacked from the Climate Research Unit of the University of East Anglia in the U.K. shocked the climate change community. These revelations were quickly dubbed ‘Climategate’” pg 337

After a paragraph introducing the topic, Dr. Moore Gish Gallops off to talk about the Copenhagen Conference, causation vs. correlation, polar bears, climate changes over time, etc., for 7 pages before mentioning “Climategate” again in another stand-alone paragraph:

“…the revelations of ‘Climategate’ in November 2009 … clearly showed that many of the most influential climate scientists associated with the IPCC have been manipulating data…” pg 344

There is another drive-by mention a page later, where he at least acknowledges there were inquiries into the “scandal” (but fails to mention that the scientists were exonerated in every inquiry, and that many newspapers were forced to retract the stories they had previously written about it). Then, after no less than 22 pages of random garble on a variety of unrelated topics, Dr. Moore raises the topic of “Climategate” once again, in perfect Gish Gallop technique: if you mention it enough, the words will stick, even if you never make a convincing case.

It is actually this fourth mention of “Climategate”, 368 pages into his 390-page book, where Dr. Moore cements the case that he is not interested in the truth. He repeats the basest accusations of “Climategate”, the ones that forced reputable newspapers and media outlets to retract the story once they were found to be false. He dismisses the three separate independent inquiries into the scandal that exonerated the scientists as “whitewashes”. He very clearly did not read the “damning” emails in context, nor did he read the results of the inquiries. The only newspaper he cites is the UK’s Telegraph, the only one that did not retract its “Climategate” reporting.

He also accuses the journals Science and Nature of having “a marked bias in support of human-caused climate change”. He is apparently talking about the magazines, but he may as well say the same thing about actual nature (which keeps reacting predictably to a warming planet) and actual science (which keeps finding more evidence of AGW).

Sorry, Dr. Moore. No “Sensible Environmentalist” can continue to ignore both science and nature, and maintain their credibility.

My final review? Don’t read this book. It will make you dumber.

Freedom of Crappy Information

The Wikipedia/Reddit/BoingBoing protest today, against the SOPA (“Stop Online Piracy Act”) and PIPA (“PROTECT IP Act”) legislation pending in the Excited States, was an interesting event. The issues are huge: freedom of information versus ownership of intellectual property, or at least that is how the fight is being cast.

So I am now going to go on a long rant, almost as dull as looking at Wikipedia’s blacked-out page today.

I think this is a fight between the existing paradigm for information and the opening of a whole new world. This is nothing less than the first strike of WWW 3.0, with the old media (print, music, motion pictures) finally understanding that their business model is dead, but refusing to go down without a fight.

More importantly, this is definitely NOT a battle between the “little guy” – everyday internet users and file sharers – and the “Big Business” people who make movies, run record labels, and produce printed materials. The antagonists here are one type of Big Business (record labels, movie studios, TV networks) and another type of Big Business (Google, YouTube, Wikipedia). To one group, the “little guy” is a customer; to the other, he is the product.

The business model of the old media was to sell you content. You bought books, you bought movie tickets, you bought records. Pretty simple. The improvement on this was the business model where you get the print or the music or the movie for free, and they sell your attention to a third-party advertiser – newspapers, radio, network TV. The internet has completed that transition, as the only real product Google has is your attention. “Free” websites like those taking part in the blackout today make a lot of money selling your attention to advertisers. Wikipedia is amazing, as they get the users to generate the content – as I guess Google does with things like Blogger, the “free” host of this blog. I’m sure they are tracking your use; after all, I can go to Google Analytics and download the stats for my blog, including stats about the people who visit it.

There is a market for web-surfing info. I met a woman a few weeks ago who worked for a marketing research firm that had 200,000 “volunteers” sharing their surfing habits with it; her firm stats-massaged the data, then packaged it into marketing plans for clients. She was reluctant to mention it, but admitted most of the “volunteers” had no idea they were volunteers, because they hadn’t read the small print in the terms and conditions of some free “app” they downloaded for their iPhones.

So the SOPA and PIPA battle seems to be between old media, who want to make you pay for content, and new media, who want to make it easy for you to share content you may or may not have created, as long as they can track how you do it.

Frankly, I don’t give a damn, because the internet is, in my opinion, too big to stop. Its whole purpose is to be distributed, and the old media types can play whack-a-mole with sites trying to “steal” their intellectual property, but they will never win. I’m not saying it is right; I’m saying it is reality. Old media would be better served updating their business models before they join the buggy whip and quill pen industries in the dustbin of old ideas.

Still, isn’t this about freedom of information, you say? I guess it is, but primarily it is about the freedom of bad information. If I am going to fight a freedom-of-information battle, it will be against Elsevier and the Research Works Act – an Act that would reward the private sector with exclusive distribution rights to academic research papers.

For those of you not in science, let me explain. When Patrick Johnstone, researcher, does a bit of science, he tries to publish it in a peer-reviewed journal. You do this for several reasons: it provides legitimacy to the research you do, and it is the easiest way to share your data and results with other researchers – they can vet it, they can prove it wrong, they can build on it. That is how science works. We also do it because our worth as researchers is often measured by our ability to publish original work in academic journals. In a sense, these journals are the currency of science.

In the good old days, these journals were produced by academic organizations; the Canadian Journal of Earth Sciences, say, is produced by the Geological Association of Canada. If you were a member of the GAC, you got a copy of the CJES every month, and academic institutions would get copies for their libraries. Students and other researchers would go to the library and (illegally – but that’s a grey area) photocopy papers from the journal to cite them, learn from them, etc.

However, publishing and producing these journals, distributing them to libraries, and – this is more important – creating online access to papers and searchable databases of their content soon became the business of a few multinational corporations, Elsevier probably being the biggest. This is a Dutch company that also does great business running arms shows, but that’s another story.

As a result, for scientific researchers to share data over this great technology originally designed to allow scientific researchers to share data – the internet – they gotta pay Elsevier or the like. If you search for the paper I wrote in 2006 while doing my Masters, you can find it mentioned in library search engines, but if you try to read it online you get this. Ingenta Corporation owns it. You can read it for $40. Trust me, I will see none of that money. I have no right to that intellectual property.

Or you can do what this instructor has done, and put it online illegally. Which I, as the author, might be OK with (no money out of my pocket), but which the Research Works Act wants to make sure is very illegal.

I can hear you now – who cares if a couple of tweedy-sleeved academics can’t own their vanity projects? But the problem isn’t my little paper about some obscure rock outcrops in the Gulf Islands (talk about crappy information!); it is that academic data will be kept ever more separate from the public at a time when the entire world is shifting towards freer exchange of information.

So when the topic of anthropogenic climate change comes up, a crank like “Lord” Monckton can make a bunch of bald assertions about how CO2 is good for plants and therefore climate change isn’t a problem, then back it up with an opinion piece in the Daily Telegraph and a blog post put out by the Heartland Institute. It’s all bullshit, but it looks legit to the average reader. How is a curious person to know? A well-intentioned scientist could refute Monckton’s points with a ream of scientific data to the contrary. She could even give you links to 20 or 30 peer-reviewed scientific articles that clearly demonstrate the falsehood of his statements. But you won’t be able to read them unless you pay $100 or more to get past Elsevier’s paywall.

So the freedom of information question to me is this: What is the fate of our discourse in the WalMart world if bullshit is free, but factual scientific data costs large?