Looking back at blockbuster trade between LCS-bound Dodgers …

Carl Crawford and Adrian Gonzalez, Dodgers

Carl Crawford and Adrian Gonzalez never got as far with the Red Sox as they have this year with the Dodgers. (Adam Davis/Icon SMI)

The Dodgers and Red Sox are the first two teams to secure berths in their respective League Championship Series, with the former dispatching the Braves from the National League Division Series on Monday night and the latter eliminating the Rays from their American League Division Series on Tuesday. Whether or not the two teams take the next step to meet in the World Series, their current shape owes much to the stunning blockbuster trade they pulled off on Aug. 25, 2012, and as such, it merits a look back.

The deal allowed the Red Sox to clean house after the team’s September 2011 collapse morphed into an even unhappier 2012, one in which Boston lost 93 games, its highest total since 1965. In sending Josh Beckett, Carl Crawford, Adrian Gonzalez and Nick Punto to Los Angeles, the Sox freed themselves from more than $270 million in future salary commitments while unloading players generally perceived to be unhappy in Boston for one reason or another. Crawford remains bitter about his Beantown experience, and neither he nor Gonzalez would talk to Boston-based muckraker Dan Shaughnessy when the Sox visited L.A. for an interleague series that coincidentally marked the one-year anniversary of the trade.

The players the Red Sox received in return — Rubby De La Rosa, Ivan De Jesus, James Loney, Jerry Sands and Allen Webster — made little direct impact on the 2013 team; in fact, De Jesus, Loney and Sands were gone from the organization before the season started. Instead Boston filled the spots of the departed players with lower-cost free agents such as Ryan Dempster, Jonny Gomes, Mike Napoli and Shane Victorino. As a result, not only did the team cut its Opening Day payroll from $175.2 million in 2012 to $154.6 million this year, but those signings by and large paid off handsomely as the Sox won an AL-high 97 games and claimed the AL East flag for the first time since 2007.

For the Dodgers, the acquisition of the aforementioned quartet wasn’t enough to spur a run to the playoffs in 2012, but this year, those players did help Los Angeles to 92 wins and its first NL West title since 2009. While the amount of salary the Dodgers took on at almost no discount came as a shock, it did nothing to hinder the new ownership group — which had purchased the team for a record $2.15 billion in the spring of 2012 — from continuing to spend money. Moreover, the additions anticipated the erosion of the free agent market as a means of acquiring high-end talent, though it remains to be seen whether Gonzalez and Crawford, in particular, will maintain their value as they age.

When looked at from a strictly sabermetric standpoint, calling the trade a win for both sides may not make total sense. For 2013, the Dodgers netted 7.0 Wins Above Replacement from the four players they received, at a cost of $58.25 million — not a good return given that the cost of a win on the free agent market is about $5-6 million. Even so, Los Angeles got significantly more value from three of their four roster spots relative to 2012, and that expenditure didn’t prevent them from pushing their payroll above $200 million. While Beckett, Crawford and Gonzalez may not earn their entire keeps over the remainder of their deals, the Dodgers at least didn’t have to sacrifice any compensatory draft picks to get them.

On the Red Sox side, the two players still with the team came in at -1.2 WAR for some prorated fraction of the minimum salary, but their value lies in the years of club control they still have remaining as well as the payroll flexibility that enabled general manager Ben Cherington to make moves that helped return the team to its winning ways.

What follows is a closer look at the trajectory of each player involved in the trade, starting with the Dodgers as their end of the deal is more visible at this time.

Los Angeles

Adrian Gonzalez
Amid a barrage of injuries, the 31-year-old first baseman was the Dodgers’ steadiest player in 2013. His 157 games were 15 more than Andre Ethier and 25 more than any other regular, and both his 22 homers and 100 RBIs led the team.

Gonzalez’s final batting line (.293/.342/.461) wasn’t tremendously impressive, but it was a virtual carbon copy of his 2012 performance (.299/.344/.463) in a more pitcher-friendly context; as a result, his OPS+ rose from 117 to 126, and his Wins Above Replacement from 3.5 to 3.9. That’s still a ways off from his 2006-2011 peak (.297/.380/.520 for a 144 OPS+, with an average of 31 homers and 4.5 WAR) via five seasons in San Diego’s Petco Park and one in Fenway, and there’s reason to be concerned that his surgically repaired right shoulder (operated on in October 2010) will never allow him to reach those heights again. With $106 million still due over the next five years, he’ll have to age gracefully to maintain his value.

Carl Crawford
Crawford spent 33 days on the disabled list in June and July due to a hamstring strain but was still more productive this season than in his injury-plagued days in Boston. He hit .283/.329/.407 for a 108 OPS+ and 1.7 WAR, up from .260/.292/.419 for an 89 OPS+ and a combined 0.6 WAR in 2011 and ’12.

Still, that’s a far cry from the performance in Tampa Bay that induced the Sox to sign him for $142 million over seven years in December 2010. As a Ray he hit .306/.360/.473 for a 125 OPS+ in 2009 and ’10 while averaging 17 homers, 54 steals and 6.0 WAR. This year, he stole just 19 bases and homered six times, with his power disappearing after April; he hit just .276/.312/.378 from May through September before erupting for a .353/.421/.882 line and three home runs in the Division Series, with two of those homers coming in Monday’s clincher. Still, he’s due $82.5 million over the next four years and has a long way to go before he’ll be worth that money.

Nick Punto
Jokingly referred to as the centerpiece of “the Nick Punto trade,” the 35-year-old utilityman came in particularly handy this year for the injury-riddled Dodgers. He started 71 games at second base, shortstop and third base, hit .255/.328/.327 in 335 PA — his highest total since 2009 — slid into first base at every opportunity and shredded the jersey of any player who collected a walk-off hit for the Dodgers. His 2.2 WAR was more than enough to justify his $1.5 million salary, and was in fact his highest total since 2008. Given that, it would hardly be a shock if Los Angeles retains him once he reaches free agency this winter.

Josh Beckett
After getting lit up for a 5.23 ERA in 21 starts with Boston in 2012, Beckett showed signs of improvement upon moving to the Dodgers, posting a 2.93 ERA in seven late-season starts. Alas, he struggled mightily in 2013 and was torched for a 5.19 ERA and 1.7 homers per nine in eight starts through mid-May before going on the disabled list due to an irritated nerve.

It was ultimately discovered that he had Thoracic Outlet Syndrome, and he underwent surgery in July, shelving him for the year. Los Angeles will try to salvage some value out of him in 2014, when he’ll make $15.75 million in the final season of his four-year deal.

Boston

James Loney
After batting a dismal .254/.302/.344 in 359 PA for the Dodgers in 2012 — a continuation of his long downward slide from his 2007 rookie season — Loney skidded even more severely upon moving to Boston. Playing nearly every day in Gonzalez’s place over the final month of the season, he hit just .230/.264/.310 with two homers in 106 PA and finished the year with -1.1 WAR.

Not surprisingly, he received little attention upon hitting the free agent market, but the budget-minded Rays signed him for $2 million and got a strong return for their money. Even though he tailed off considerably in the second half, the 29-year-old lefty finished at .299/.348/.430; his 13 homers and 118 OPS+ matched his best marks since ’07, while his 2.7 WAR was a career high. His .351/.404/.486 line in 307 PA away from Tropicana Field made for the highest road batting average of any player with at least 200 PA.

In addition to flashing the leather in impressive fashion, he went 6-for-16 with a pair of doubles in the Rays’ abbreviated postseason run. His fine season should help him draw considerably more interest this winter than he got last time around.

Allen Webster
The 49th-ranked prospect on Baseball America‘s Top 100 Prospects list and the talk of the Red Sox camp back in spring, the 23-year-old Webster spent the season shuttling between Triple-A Pawtucket and the majors, being recalled on four separate occasions and making a total of seven starts and one relief appearance. Alas, while he put up solid numbers at Pawtucket (3.60 ERA and 9.9 strikeouts per nine) and flashed mid-90s heat at the major league level, he struggled with his command and control when given the chance by the Sox and was knocked around for an 8.60 ERA and 2.1 homers per nine in 30 1/3 innings.

He still projects as a third or fourth starter at the major league level by virtue of three plus pitches (fastball, slider, changeup), but he’s far from claiming a spot in Boston’s rotation.

Rubby De La Rosa
De La Rosa tantalized in 60 2/3 innings with the Dodgers in 2011 before needing Tommy John surgery, and he threw just 9 2/3 competitive innings in 2012, so 2013 was really a comeback year for him. Not surprisingly, he had control problems at Pawtucket, walking 5.4 per nine while striking out 8.5 en route to a 4.26 ERA in 80 1/3 innings, mostly as a short-stint starter. The 24-year-old righty made 11 appearances for the Red Sox totaling 11 1/3 innings, all in relief. That may be where his future lies given his high-90s heat, inconsistent secondary offerings and mechanical issues, but his ceiling is as a closer, so he may yet have a big major league impact.

Ivan De Jesus
A 25-year-old utility infielder at the time of the trade, De Jesus collected just eight plate appearances with the Red Sox after being dealt, and still had only 80 in his major league career by the end of last season, with a .205/.253/.247 line. The Sox sent him to Pittsburgh in December as part of the Joel Hanrahan/Mark Melancon trade, but he has yet to debut for the Pirates. He hit .319/.380/.457 in 345 plate appearances at Triple-A Indianapolis this year, but may be stuck with the “organizational depth” tag unless another team envisions a larger role.

Jerry Sands
Sands, an outfielder/first baseman who homered 26 times for the Dodgers’ Triple-A Albuquerque team in 2012, never got to play for the Red Sox, and like De Jesus, he was sent to Pittsburgh in the Hanrahan/Melancon trade. He didn’t get to play for the Pirates either. The now-26-year-old righty hit just .207/.311/.329 with seven homers in 397 PA at Indianapolis, making headlines only for a one-game suspension for going into the stands to confront a heckler in Toledo. He did miss time in June with a hamstring injury, and again like De Jesus, looks more like organizational depth than anything else.

Improving Search Rank by Optimizing Your Time to First Byte – Moz


Back in August, Zoompf published newly uncovered research findings examining the effect of web performance on Google’s search rankings. Working with Matt Peters from Moz, we tested the performance of over 100,000 websites returned in the search results for 2000 different search queries. In that study, we found a clear correlation between a faster time to first byte (TTFB) and a higher search engine rank. While it could not be outright proven that decreasing TTFB directly caused an increasing search rank, there was enough of a correlation to at least warrant some further discussion of the topic.


The TTFB metric captures how long it takes your browser to receive the first byte of a response from a web server when you request a particular website URL. In the graph captured below from our research results, you can see websites with a faster TTFB in general ranked more highly than websites with a slower one.



We found this to be true not only for general searches with one or two keywords, but also for “long tail” searches of four or five keywords. Clearly this data showed an interesting trend that we wanted to explore further. If you haven’t already read our prior article on Moz, we recommend you check it out now, as it provides useful background for this post: How Website Speed Actually Impacts Search Ranking.


In this article, we continue exploring the concept of Time to First Byte (TTFB), providing an overview of what TTFB is and steps you can take to improve this metric and (hopefully) improve your search ranking.

What affects TTFB?


The TTFB metric is affected by 3 components:

  1. The time it takes for your request to propagate through the network to the web server
  2. The time it takes for the web server to process the request and generate the response
  3. The time it takes for the response to propagate back through the network to your browser.


To improve TTFB, you must decrease the amount of time for each of these components. To know where to start, you first need to know how to measure TTFB.

Measuring TTFB


While there are a number of tools to measure TTFB, we’re partial to an open source tool called WebPageTest.


Using WebPageTest is a great way to see where your site performance stands, and whether you even need to apply energy to optimizing your TTFB metric. To use, simply visit http://webpagetest.org, select a location that best fits your user profile, and run a test against your site. In about 30 seconds, WebPageTest will return you a “waterfall” chart showing all the resources your web page loads, with detailed measurements (including TTFB) on the response times of each.


If you look at the very first line of the waterfall chart, the “green” part of the line shows you your “Time to First Byte” for your root HTML page. You don’t want to see a chart that looks like this:


[Image: WebPageTest waterfall chart showing a root HTML page with a very slow time to first byte]


In the above example, a full six seconds is getting devoted to the TTFB of the root page! Ideally this should be under 500 ms.
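
If you’d rather script a quick spot-check than rerun WebPageTest every time, a minimal Python sketch like the one below can approximate the same number. It simply times a bare HTTPS request until the response headers arrive, so the figure includes DNS, connect and TLS time; treat it as a rough stand-in for the TTFB that WebPageTest reports, and swap www.example.com for your own host.

```python
import time
import http.client

def measure_ttfb(host, path="/"):
    """Rough TTFB: time from issuing the request until the status line and
    headers of the response arrive (includes DNS, connect and TLS)."""
    conn = http.client.HTTPSConnection(host, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path, headers={"User-Agent": "ttfb-check"})
    conn.getresponse()            # returns once the first response bytes are parsed
    elapsed = time.perf_counter() - start
    conn.close()
    return elapsed

if __name__ == "__main__":
    print("TTFB: %.0f ms" % (measure_ttfb("www.example.com") * 1000))
```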


So if you do have a “slow” TTFB, the next step is to determine what is making it slow and what you can do about it. But before we dive into that, we need to take a brief aside to talk about “Latency.”

Latency


Latency is a commonly misunderstood concept. Latency is the amount of time it takes to transmit a single piece of data from one location to another. A common misunderstanding is that if you have a fast internet connection, you should always have low latency.


A fast internet connection is only part of the story: the time it takes to load a page is not just dictated by how fast your connection is, but also how FAR that page is from your browser. The best analogy is to think of your internet connection as a pipe. The higher your connection bandwidth (aka “speed”), the fatter the pipe is. The fatter the pipe, the more data that can be downloaded in parallel. While this is helpful for overall throughput of data, you still have a minimum “distance” that needs to be covered by each specific connection your browser makes.


The figure below helps demonstrate the differences between bandwidth and latency.


[Image: diagram contrasting bandwidth (the width of the pipe) with latency (the distance the data must travel)]


As you can see above, the same JPG still has to travel the same “distance” in both the higher and lower bandwidth scenarios, where “distance” is defined by two primary factors:

  1. The physical distance from A to B. (For example, a user in Atlanta hitting a server in Sydney.)
  2. The number of “hops” between A and B, since internet traffic is routed through an increasing number of routers and switches the further it has to travel.


So while higher bandwidth is most definitely beneficial for overall throughput, you still have to travel the initial “distance” of the connection to load your page, and that’s where latency comes in.


So how do you measure your latency?

Measuring latency and processing time


The best tool to separate latency from server processing time is surprisingly accessible: ping.


The ping tool is pre-installed by default on most Windows, Mac and Linux systems. What ping does is send a very small packet of information over the internet to your destination URL, measuring the amount of time it takes for that information to get there and back. Ping uses virtually no processing overhead on the server side, so measuring your ping response times gives you a good feel for the latency component of TTFB.


In this simple example I measure my ping time between my home computer in Roswell, GA and a nearby server at www.cs.gatech.edu in Atlanta, GA. You can see a screenshot of the ping command below:


[Image: screenshot of the ping command run against www.cs.gatech.edu]


Ping repeatedly tested the server’s response time and reported an average of 15.8 milliseconds. Ideally you want your ping times to be under 100 ms, so this is a good result (though admittedly the distance traveled here is very small; more on that later).


By subtracting the ping time from your overall TTFB time, you can then break out the network latency components (TTFB parts 1 and 3) from the server back-end processing component (part 2) to properly focus your optimization efforts.
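
As a back-of-the-envelope example of that subtraction (our own illustration, not part of the original study), the Python snippet below averages a few pings and subtracts the round trip from a TTFB figure you already have, say from WebPageTest. The host name and the 620 ms TTFB are placeholders, and the ping parsing assumes the Linux/macOS summary format.

```python
import re
import subprocess

def ping_rtt(host, count=5):
    """Average round-trip time in seconds, parsed from the ping summary line
    on Linux/macOS (e.g. 'rtt min/avg/max/mdev = 14.1/15.8/18.2/1.3 ms')."""
    out = subprocess.run(["ping", "-c", str(count), host],
                         capture_output=True, text=True, check=True).stdout
    avg_ms = float(re.search(r"= [\d.]+/([\d.]+)/", out).group(1))
    return avg_ms / 1000.0

host = "www.example.com"      # placeholder: use your own site
ttfb = 0.620                  # placeholder: total TTFB in seconds, e.g. from WebPageTest
rtt = ping_rtt(host)          # parts 1 and 3: network latency, there and back
backend = ttfb - rtt          # part 2: rough back-end processing estimate
print(f"network ~{rtt*1000:.0f} ms, back-end ~{backend*1000:.0f} ms of {ttfb*1000:.0f} ms TTFB")
```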

Grading yourself


From the research shown earlier, we found that websites with the top search rankings had TTFB as low as 350 ms, with lower-ranking sites pushing up to 650 ms. We recommend a total TTFB of 500 ms or less.


Of that 500ms, a roundtrip network latency of no more than 100ms is recommended. If you have a large number of users coming from another continent, network latency may be as high as 200ms, but if that traffic is important to you, there are additional measures you can take to help here which we’ll get to shortly.


To summarize, your ideal targets for your initial HTML page load should be:

  1. Time to First Byte of 500 ms or less
  2. Roundtrip network latency of 100 ms or less
  3. Back-end processing of 400 ms or less


So if your numbers are higher than this, what can you do about it?

Improving latency with CDNs


The solution to improving latency is pretty simple: Reduce the “distance” between your content and your visitors. If your servers are in Atlanta, but your users are in Sydney, you don’t want your users to request content halfway around the world. Instead, you want to move that content as close to your users as possible.


Fortunately, there’s an easy way to do this: move your static content into a Content Delivery Network (CDN). CDNs automatically replicate your content to multiple locations around the world, geographically closer to your users. So now if you publish content in Atlanta, it will automatically be copied to a server in Sydney, from which your Australian users will download it. As you can see in the diagram below, CDNs make a considerable difference in reducing the distance of your user requests, and hence reduce the latency component of TTFB:


[Image: diagram of a content delivery network replicating content to edge servers located near users]


To impact TTFB, make sure the CDN you choose can cache the static HTML of your website’s homepage, and not just dependent resources like images, JavaScript and CSS, since that base HTML is the initial resource the Google bot will request and measure TTFB against.
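
Exactly how you let the CDN cache that base HTML depends on your stack and provider, but the general pattern is to send a Cache-Control header that allows the shared edge cache to hold the page briefly. The hypothetical Python WSGI sketch below illustrates the idea; the header values are only examples, so check which directives your CDN actually honors.

```python
from wsgiref.simple_server import make_server

def application(environ, start_response):
    """Serve the homepage HTML with headers that let a CDN edge cache it briefly."""
    body = b"<html><body>Home page</body></html>"
    headers = [
        ("Content-Type", "text/html; charset=utf-8"),
        # s-maxage applies to shared caches (the CDN edge); max-age=0 makes
        # browsers revalidate so users still see reasonably fresh content.
        ("Cache-Control", "public, max-age=0, s-maxage=300"),
        ("Content-Length", str(len(body))),
    ]
    start_response("200 OK", headers)
    return [body]

if __name__ == "__main__":
    make_server("", 8000, application).serve_forever()
```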


There are a number of great CDNs out there including Akamai, Amazon Cloudfront, Cloudflare, and many more.

Optimizing back-end infrastructure performance


The second factor in TTFB is the amount of time the server spends processing the request and generating the response. Essentially the back-end processing time is the performance of all the other “stuff” that makes up your website:

  • The operating system and computer hardware which runs your website and how it is configured
  • The application code that’s running on that hardware (like your CMS) as well as how it is configured
  • Any database queries that the application makes to generate the page, how many queries it makes, the amount of data that is returned, and the configuration of the database


How to optimize the back-end of a website is a huge topic that would (and does) fill several books. I can hardly scratch the surface in this blog post. However, there are a few areas specific to TTFB that I will mention that you should investigate.


A good starting point is to make sure that you have the needed equipment to run your website. If possible, you should skip any form of “shared hosting” for your website. What we mean by shared hosting is utilizing a platform where your site shares the same server resources as other sites from other companies. While cheaper, shared hosting passes on considerable risk to your own website as your server processing speed is now at the mercy of the load and performance of other, unrelated websites. To best protect your server processing assets, insist on using dedicated hosting resources from your cloud provider.


Also, be wary of virtual or “on-demand” hosting systems. These systems will suspend or pause your virtual server if you have not received traffic for a certain period of time. Then, when a new user accesses your site, they will initiate a “resume” activity to spin that server back up for processing. Depending on the provider, this initial resume could take 10 or more seconds to complete. If that first user is the Google search bot, your TTFB metric from that request could be truly awful.

Optimize back-end software performance


Check the configuration of your application or CMS. Are there any features or logging settings that can be disabled? Is it in a “debugging mode?” You want to get rid of nonessential operations so the site can respond to each request more quickly.


If your application or CMS is using an interpreted language like PHP or Ruby, you should investigate ways to decrease execution time. Interpreted languages have a step that converts the source into machine-understandable code, which is what actually gets executed by the server. Ideally you want the server to do this conversion once, instead of with each incoming request. This is often called “compiling” or “op-code caching,” though those names can vary depending on the underlying technology. For example, with PHP you can use software like APC to speed up execution. A more extreme example is HipHop, a compiler created and used by Facebook that converts PHP into C++ code for faster execution.


When possible, utilizing server-side caching is a great way to generate dynamic pages quickly. If your page is loading content that changes infrequently, utilizing a local cache to return those resources is a highly effective way of improving your page load time.


Effective caching can be done at different levels by different tools and is highly dependent on the technology you are using for the back-end of your website. Some caching software caches only one kind of data, while other tools cache at multiple levels. For example, W3 Total Cache is a WordPress plug-in that does both database query caching and page caching. Batcache is a WordPress plug-in created by Automattic that does whole-page caching. Memcached is a great general object cache that can be used for pretty much anything, but requires more development setup. Regardless of what technology you use, finding ways to reduce the amount of work needed to create the page by reusing previously created fragments can be a big win.
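
The specifics belong to whichever plug-in or cache server you adopt, but the underlying pattern is simple enough to sketch. The Python toy below wraps a hypothetical render_sidebar() fragment in an in-process cache with a five-minute time-to-live; a real deployment would reach for Memcached or the CMS’s own caching layer rather than a dictionary.

```python
import time
import functools

def ttl_cache(seconds):
    """Toy in-process cache: reuse a result for `seconds`, then recompute.
    Unbounded and not thread-safe -- for illustration only."""
    def decorator(fn):
        store = {}
        @functools.wraps(fn)
        def wrapper(*args):
            now = time.time()
            hit = store.get(args)
            if hit is not None and now - hit[0] < seconds:
                return hit[1]              # cache hit: skip the expensive work
            value = fn(*args)
            store[args] = (now, value)
            return value
        return wrapper
    return decorator

@ttl_cache(seconds=300)
def render_sidebar(user_segment):
    # Hypothetical expensive fragment: imagine database queries plus templating here.
    return f"<aside>recommendations for {user_segment}</aside>"

print(render_sidebar("new-visitors"))   # computed once...
print(render_sidebar("new-visitors"))   # ...then served from the cache for five minutes
```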


As with any software changes you’d make, make sure to continually test the impact on your TTFB as you incrementally make each change. You can also use Zoompf’s free performance report to identify back-end issues which are affecting performance, such as not using chunked encoding and much more.

Conclusions


As we discussed, TTFB has 3 components: the time it takes for your request to propagate to the web server; the time it takes for the web server to process the request and generate the response; and the time it takes for the response to propagate back to your browser. Latency captures the first and third components of TTFB, and can be measured effectively through tools like WebPageTest and ping. Server processing time is simply the overall TTFB time minus the latency.


We recommend a TTFB time of 500 ms or less. Of that TTFB, no more than 100 ms should be spent on network latency, and no more than 400 ms on back-end processing.


You can improve your latency by moving your content geographically closer to your visitors. A CDN is a great way to accomplish this as long as it can be used to serve your dynamic base HTML page. You can improve the performance of the back-end of your website in a number of ways, usually through better server configuration and caching expensive operations like database calls and code execution that occur when generating the content. We provide a free web performance scanner that can help you identify the root causes of slow TTFB, as well as other performance-impacting areas of your website code, at http://zoompf.com/free.

BarbaraRush4HelenaSchoolBoard: SHORT SIGHTED

If someone came up to you with a product that was completely untested and claimed that it would prepare your children for the 21st century and that they would be college and career ready if they used it, would you buy it without asking any questions? Would you be willing to throw out all your tried and tested products and replace them with the new product even if you knew you had to sign a contract promising not to make any changes to the new product no matter what? I’ll bet you would have some questions and would be leery of making wholesale replacements with things you had no power to change. Well, guess what? Mike Huckabee thinks you’re SHORT SIGHTED!! Yes, it’s not the Common Core hucksters and frauds that are shortsighted, it is you. I like Mike and agree with him on many things, but come on Mike, think a little. What do you really know about education or the program at issue here? Maybe you should get some information before you call those people trying to save America “SHORT SIGHTED”.

Isn’t it just alarming that the left can come up with just any sort of nonsense and get credit for being progressive? How many fads have you seen come and go in your lifetime? The very people that are burning our forests down and trying to stop the fracking that will give us energy independence get credit for being “progressive”. What are the rest of us that want tried and tested good education for our students that we have control over? Are we all SHORT SIGHTED?

Just maybe, people that make stupid comments and don’t know the facts are the people who are SHORT SIGHTED. I taught for thirty years and I tried all the project learning. It doesn’t work. Kids spend a lot of time on a limited amount of information and there is no real way to know who learned anything from the experience. This country was built on direct instruction- including the astronauts who sent us to the moon and brought home Apollo Thirteen. Higher level thinking skills always sound good – but any fool knows you need to have some information first- and young students need an ocean of foundational information before they can even begin to read and write effectively. Wake up Mike- I think you are being dangerously SHORT SIGHTED here.

The “progressives” have brought us fiscal disaster and cultural implosion; how in the world can you expect that what they’re now making billions of dollars on, without any trials or tests to prove it works, will improve our schools? Let’s start looking at what will really help our students learn the best and the things that will really make this country great. We know direct instruction works. We know we need better teacher training- like making teachers experts at their grade level. We know free market capitalism works and preparing students to be responsible hard workers will give them the opportunities that free markets and freedom itself offers. We know states can do their own education as they always have. We know we’re doing better than Russia, France, Singapore- pick a country- we hardly need to look to these countries to tell us how to educate our children.

Save America and save our schools by giving the progressive traitors the boot. We will not do better in our schools until we really want hard working individuals to strengthen our free market capitalist system. When we really want our children to find success in our free markets we will prepare them for that.

Abbott is hiding from the future | Victoria Rollison

My definition of maturity is the ability to be resolute in doing something challenging now which will improve our lives in the future. Unfortunately for future Australians, Abbott’s Liberal National government is completely lacking in this maturity. In opposition, Abbott’s team spent six years bashing Labor for taking on challenges to improve our future. In government, they have advocated a ‘me, me, me, now, now, now’ approach to their policy agenda, appealing to the electorate’s most selfish, short-sighted, immature instincts.

Abbott is no doubt pleased today to find that this approach has got him into power. Becoming PM was all he ever wanted. But what about the future? Whether he likes it or not, while he stands still and looks back, time rolls on towards the future. The problem for Abbott is he only ever thought as far as the election. As far as the moment where he could display his daughters dressed in white and declare himself Prime Minister. But what next? Next, I’m interested to see how Australia’s future will judge the Abbott government. A future he apparently gave no thought to.

An immature government that only thinks of itself and seems incapable of worrying about anything that might happen more than a week in advance is a very dangerous government. Josh Bornstein suggested in the Guardian this week that Abbott won the support of the electorate by scaring them into believing Australia was facing many crises. But these crises were concocted. I think he makes a good point. Unfortunately the Labor Government failed not only to play down Abbott’s boy-who-cried-wolf-claims, but even backed some of them up by going along with the idea that there was a cost of living crisis (when there wasn’t). On top of this, Labor failed to back up its own economic credentials in delivering an economic success story, not a crisis. And Labor also failed to persuade the easily frightened electorate of the long term benefits of the Carbon Price, while Abbott successfully persuaded them of the short term costs (which didn’t eventuate). Of course, it wasn’t exactly easy for Labor to get their positive message across, given the barrage of publicity Abbott was gifted from every news outlet in the country, including the ABC, while so-called-journalists offered zero scrutiny of Abbott’s messages of doom. The media loves a crisis, whether the crisis exists or not.

The ridiculous and tragic part of this tale of ‘crisis’ propagated by Abbott is that there is a real crisis looming. Climate change. Yet Abbott convinced people who were all too willing to be deceived, that climate change is just a big over-reaction by alarmists, and that the Carbon Price hit on their electricity bills was the only thing they had to fear.

But what now? The future is still coming, and climate change isn’t going away. Abbott is still promising to replace Labor’s Carbon Price with the unpopular Direct Action policy – the world’s most expensive government tree planting exercise, which no expert has been able to prove will have any discernible impact on Australia’s carbon emissions. Labor’s Carbon Price is reducing emissions, and Abbott is scrapping it, without even explaining first how, logistically, his government will plant 20 million trees with a 15,000-strong ‘Green Army’. This lack of foresight into the future is going to become a huge political mess for Abbott. And covering it all up isn’t going to help either. Not when the Climate Commission is now an independently funded Climate Council, which is dedicated to keeping reports like the latest climate predictions from the IPCC front and centre in the community’s mind. Abbott might like to think his buddies in the media will cover up climate change for him, but what happens when more and more engaged Australians flood to social media, independent Australian media and international press to find out the truth for themselves? What happens when even the doubters and deniers start to notice temperature records being broken on a monthly basis? When the predicted sea rises start to affect beachside property in Sydney, and not just small islands which are currently out of sight and out of mind? Abbott can’t hide from the future forever.

It’s actually difficult to find a policy area where Abbott and his colleagues have given any thought to the future. But what happens when they have to come face to face with this future? A future just around the corner?

In 1943, Thomas Watson, the chairman of IBM, said “I think there is a world market for maybe five computers”. This is the sort of future thinking that Malcolm Turnbull deployed when he said “25 megabits per second will enable anybody in residential situations to do everything they want to do or need to do in terms of applications and services”. He should have added another word to that sentence. Today. But what about tomorrow Turnbull? Wouldn’t the best idea be to future-proof Australia’s broadband network, so that it doesn’t rely on old rotting copper infrastructure? Shouldn’t you look to the future and expect megabit needs to increase exponentially as technology improves? Or is the future not of your concern?

This week Pyne floated the suggestion that he will again cap university places, and he would also like to see the removal of the student amenities fee. (Abbott has backed quickly away, although it’s clear the policy change is still on the table). Characteristically, Pyne explained his concerns were that Labor’s demand driven tertiary education sector (in other words, a huge increase in the number of Australians with a tertiary education, many first in their family to gain a degree) was going to have a negative influence on the ‘quality’ of Australia’s tertiary education sector.

Apart from the fact that Pyne is breaking a pre-election promise to not re-introduce caps, it’s clear his perception of a ‘quality’ education is a ‘scarce’ education. Like a Porsche owner bemoaning the number of other Porsches he sees on the road, as evidence of the lack of ‘status’ accorded to him by spending a small fortune on a car. Pyne doesn’t want just ‘anybody’ to have a tertiary education. Especially not those who are first in their family. No, only the privileged few should have access to a quality education, presumably to maintain their privilege and to squash social mobility and aspiration. But this sort of thinking reveals the lack of foresight Pyne has about the benefits to Australia of lifting the number of the population with tertiary degrees. Pyne needs to understand that the point of a university degree is not to add a qualification to your resume, to frame a piece of paper on your wall, a piece of paper lots of other people don’t have. The point of educating more Australians is to have more educated Australians. To have a highly skilled population. To improve productivity. To increase innovation. To better Australia’s competitive advantage against other developed economies with ever increasing numbers of educated adults. But this is a future goal, something Pyne obviously cares little about.

And what about Abbott and Bishop claiming for years that they can turn around asylum seeker boats and send them back to Indonesia? Did they consider what they might do in the eventuality where the Indonesian government won’t have a bar of this reckless plan? Or did they just think they’d worry about that later? Later is here. Or look at Abbott’s cutting of Australian Research Council funding, which will stifle Australia’s future scientific advancements and strangle the economic benefits of a high-tech economy. How about Abbott’s preferences for road infrastructure, over funding for renewable energy technologies to eventually replace polluting vehicles. It’s all about now and what Abbott thinks he needs to do now, to win power now. Abbott cares only about himself. He doesn’t even seem to care about his daughters’ futures. It’s fairly clear even the election in 2016 is way too far away to have any effect on Abbott’s current behaviour. This is good news for those hoping for a #OneTermTony. The future is not going to look kindly on Abbott’s government. Abbott should have thought about this. But thankfully, he and his team don’t have the maturity, nor the intelligence, to notice.



Wait 'Til Next Year: Kansas City Royals | The Strike Zone – SI.com

Justin Maxwell, Royals

Justin Maxwell’s game-winning grand slam last Sunday was one of this season’s many highlights in Kansas City. (Charlie Riedel/AP)

While so much of our day-to-day attention in this space is devoted to the teams still battling for playoff spots, we feel as though it’s only fitting to acknowledge the teams that have been mathematically eliminated from contention, giving them a brief sendoff that should suffice until Hot Stove season. Thus, the Wait ‘Til Next Year series.

Current Record: 84-75 (.528, third in AL Central)

Mathematically Eliminated: Sept. 25

What went right in 2013:

The Royals assured themselves of their first winning record since 2003 and just their second since the 1994 players’ strike; with one more win, they can claim the franchise’s best season since 1989, when they won 92 games. Thanks to the two-wild-card format and some late-season resiliency — including the league’s second-best record (41-26) since the All-Star break — they brought meaningful September baseball to Kansas City, rebounding from an 8-20 May to remain in contention until midway through the season’s final week. For all of manager Ned Yost’s tactical flaws, his ability to keep the Royals climbing off the mat when they could have thrown in the towel isn’t nothing.

Kansas City’s modest success owed mainly to having the league’s best run-prevention (3.73 runs per game). Starters James Shields (3.21 ERA in 221 2/3 innings, with a league-high 26 quality starts) and Ervin Santana (3.24 ERA in 211 innings with 23 quality starts), both acquired in offseason trades, have put together strong years and routinely kept the Royals in games even if they did receive meager run support. Third starter Jeremy Guthrie has eaten 211 2/3 innings with a 4.04 ERA, and mainstay Bruce Chen has delivered a 3.79 ERA in 14 starts after spending the first half of the season in the bullpen. As a unit, the rotation ranks fifth in the league in ERA (3.88) and third in quality start rate (58 percent), a dramatic improvement over 2012, when it ranked 11th in ERA (5.01) and 13th in quality start rate (43 percent).

The bullpen, which had ranked fourth in ERA in 2012 (3.17), has improved to a league-best 2.56. Closer Greg Holland (1.23 ERA, 13.8 K/9, 46 saves) was outstanding, setup men Aaron Crow and Tim Collins were very good, and failed former No. 1 overall pick Luke Hochevar reinvented himself via a 1.95 ERA and 10.4 strikeouts per nine, making his previous experience as a starter (5.44 ERA, 6.2 K/9) look like six years of head-beating against a wall.

The offense, which ranks 11th in the league in scoring (3.98 runs per game), 13th in slugging (.377) and dead last in homers (107), has fewer happy stories. Billy Butler (.286/.374/.406 with 14 homers), Alex Gordon (.267/.328/.424 with 20 homers and 4.1 WAR) and Salvador Perez (.287/.319/.419 with 11 homers and 3.8 WAR) have kept their heads above water even if they’ve fallen short of high expectations. The real success has been Eric Hosmer, who after a dismal 2012 and a .261/.320/.333 line through May, has hit a sizzling .318/.368/.495 since to lift his overall line to .301/.354/.448 with 17 homers. Deadline acquisition Justin Maxwell has hit .292/.373/.551 with five homers in 32 games, including a walkoff grand slam against the Rangers on Sept. 22 that may stand as the season’s high point.

What went wrong in 2013:

On the offensive side, Mike Moustakas (.232/.285/.358, 11 homers, −0.2 WAR) has had a terrible season and looks to be even further in the weeds than Hosmer was after last year. Alcides Escobar (.234/.259/.301, 0.4 WAR) had by far his worst season with the bat, one so bad it offset his good glovework. Second base was a proverbial Gaping Vortex of Suck, with Chris Getz, the since-released Elliot Johnson, the since-suspended Miguel Tejada, late season pickup Emilio Bonifacio et al hitting a combined .230/.282/.353. Even with the work of Hosmer, the KC infield combined for the league’s second-lowest OPS at .661, via a .253/.300/.361 line.

The Royals’ offensive problems weren’t confined to the infield; the outfield as a whole hit .263/.318/.396 for the league’s third-worst OPS. Jeff Francoeur (.208/.249/.322) did nothing to merit regular playing time — let alone offset the absence of the traded Wil Myers — before drawing his release in early July. Lorenzo Cain, Jarrod Dyson and David Lough all hit worse than league average, though they offset that with a combined 44 Defensive Runs Saved.

As for the pitching, Wade Davis, who was acquired along with Shields in the Wil Myers blockbuster, was rocked for a 5.40 ERA in 24 starts and six relief appearances, and Luis Mendoza, who made 15 starts and seven relief appearances, was battered for a 5.36 ERA himself. With even slightly better starters taking the ball roughly one-quarter of the time, the Royals might have come away with a wild card spot.

Overall outlook:
In his eighth year at the helm, general manager Dayton Moore went for broke in an effort to turn the Royals into a playoff team, and he nearly succeeded thanks to a much-improved rotation and a stellar bullpen. However, trading six years of Myers, the 2012 Baseball America Minor League Player of the Year and now a solid AL Rookie of the Year candidate, for two years of Shields was a short-sighted move that looks more costly given that so many of the team’s homegrown, affordable hitters have underwhelmed.

The good news is that by finishing with a winning record, the organization’s credibility has been enhanced significantly. Depending upon whether ownership is willing to continue loosening the purse strings, the Royals will have a much better chance at attracting desirable midlevel free agents who might have shunned them before. It’s not out of the question that Santana, for example, could be convinced to stay with a competitive multiyear deal.

The bad news is that a near-miss for a wild-card spot doesn’t mean all that much in the grand scheme of things. Going into the final two weeks of the season with a 4.3 percent chance of making it into a coin-flip game is a low bar to clear, one that doesn’t justify trading the organization’s best prospect, particularly when he would have helped an impoverished offense. That trick won’t work again; the Royals have to continue to improve and to challenge for division titles with a wild-card spot as a fallback, not as a longshot hope. They can’t settle for yet another year of no production from Moustakas, their rightfielders, or especially their second baseman,  and they need to recognize the limitations of Gordon and Butler as building blocks — not to mention the limitations of Moore as an architect and Yost (whose contract is up) as skipper.

There’s much more work to be done, but Kansas City at last is moving in the right direction.

Where We Could Really Use the "Next Steve Jobs" | The Stopped …

A lot of people focus on the smartphone market, and complain with each new Apple product that… Steve Jobs would have done something different, or better, or both. Steve Jobs brought something unusual to Apple, specifically a willingness to make huge gambles on theoretical technology, and to release products that could turn out to be failures. Apple seems to have become exceedingly cautious, but I’m not sure that is so much the result of a change in the company’s philosophy as it is a change in consumer expectations. The iPhone 4 antenna issue, and the Apple Maps brouhaha, suggest that consumers want nothing less than perfect and, rather than launching risky products that might inspire a mixed reaction or turn out to be the next Newton (or Zune), caution has spread across the industry.

The real story behind the focus on portable electronics is not so much that a life-changing innovation is just around the corner. It’s much more that there is profit in the upper end of the market, the mass market having already been commoditized. Smartphone advances reflect the importance of competition as, even though Apple sees the rise and fall of Nokia as a cautionary tale, history suggests that product development in a commoditized market tends to be slow. Most companies see little to no point in spending hundreds of millions of dollars to marginally improve a product that will likely sell at the same price point as before. That’s the sort of context in which a short-sighted CEO of a company like Hewlett-Packard might decide that it no longer makes sense to fund research that is not directly aimed at turning a profit, or why a similarly short-sighted company’s products might go from excellent to “good enough” in order to increase margins by decreasing production costs. (Am I talking about the same company?)

One might argue that televisions have seen marked advances in technology despite being a largely commoditized market, but that has been driven in no small part by the introduction of HDTV and the money poured into the development of new displays for computer users and commercial settings. Even in that context, major players like Panasonic have a very difficult time turning a profit, and the pool of companies that produce television displays and sets is not expanding.

One area that has seen a surprising lack of innovation is the desktop computer market. That’s in part because it’s a tough nut to crack – computers do pretty much what we want them to do, there are no obvious ways to dramatically improve the user interface, and the technologies for interacting with computers other than through a mouse and keyboard tend to focus on niche users or turn out to be largely impractical. It may be that one day we’ll have displays and “no touch” gesture controls as shown in the film, “Minority Report”, but that’s not on the horizon. Basically, the desktop computer market seems a lot like the television market. To the extent that incremental improvements are seen, they’re in no small part the result of R&D in the mobile marketplace. The biggest “innovation” we’ve seen in a desktop operating system was Microsoft’s annoying, clumsy interference with the user experience by putting a “smart tile” display between the computer user and the desktop – that is, they tried to make the desktop experience more like mobile, never mind whether that makes sense. Apple has made similar, albeit less in-your-face changes to its desktop operating system, with its Launchpad and App store, but they’re really not part of the ordinary desktop experience.

Somebody commented to me recently that Apple seemed to be “giving up” on the competition for desktop computers. I responded that they’re chasing money and market share, and that right now they can find both in the mobile space while there is little incentive to try to claw out a greater market share in the desktop market. The cost of significantly expanding their desktop presence would be significant, and there’s really not much money to be made in that market. Were Apple to start producing $300 – $600 portable computers it might find a market, but it would have to make the quality cuts that are readily apparent in computers in that price range, potentially costing it brand loyalty over the long run in the same manner that the low quality Apple products of the Sculley era damaged Apple’s reputation and competitiveness. Why mass produce low-cost computers that have to be sold at tiny margins and that would likely have an impaired user experience, when you can continue to sell $1000+ computers that people enjoy using, and sell millions of highly profitable iPads to the sub-$1,000 market?

Really, though, the desktop industry needs to be woken from its complacency, much in the manner that Google and Apple rebooted then-stagnant browser development with Chrome and Safari. The problem being, you either need a company that sees a long-term gain in developing new technology at a significant short-term cost, the way Xerox PARC laid the foundation for the computer mouse and windows-driven displays, or one that doesn’t want to be indentured to a competitor’s product. And if you take the HP Labs / Xerox PARC approach, you also need a visionary who can see how a new idea can be improved and put into widespread use – after all it was Apple, not Xerox, that turned the mouse and menu/windows-driven interface from an impractical lab-based demo to the desktop standard.

The manner in which the world, and Apple, has changed is perhaps best illustrated by today’s quiet announcement that the iMac has been updated. You can go to the Apple Store and buy one today – but the new version isn’t even flagged as “new”. A secondary illustration comes from the Mac Pro, the high-end computer Apple develops for the professional market, which is soon to be released in an innovative new case. But that’s innovation in the same sense as the Mac Mini was an innovation – great design and packaging, but nothing you couldn’t have accomplished in a traditional mini tower case. Apple did promote the redesigned Mac Pro, some months back, but when will it actually come to market? Later this year. There’s no sense of urgency, as there is in the highly competitive mobile marketplace.

An argument can be made that when a technology reaches a certain point of maturity, all new developments will be incremental. Perhaps the keyboard and mouse-driven desktop computer are pretty much it – and unless the entire concept is reinvented (much as the iPhone reinvented the smartphone market) this is it. People seem disappointed when the new “state of the art” smartphone looks like the old one – as if there’s a great deal you can do to differentiate the hardware of a typical smartphone in ways that are obvious or exciting. Even in that market, unless a new, disruptive technology comes along the biggest future changes will come through software. In fifteen years, today’s typical smartphone and tablet apps are likely to look about as sophisticated as Pong. But still, it would be nice to have a sense that somebody out there – somebody positioned to disrupt the market – was looking at “impractical, unworkable” new ideas from a different angle, and asking, “What if….”

Where We Could Really Use the "Next Steve Jobs" | The Stopped …

A lot of people focus on the smartphone market, and complain with each new Apple product that… Steve Jobs would have done something different, or better, or both. Steve Jobs brought something unusual to Apple, specifically a willingness to make huge gambles on theoretical technology, and to release products that could turn out to be failures. Apple seems to have become exceedingly cautious, but I’m not sure that is so much the result of a change in the company’s philosophy as it is a change in consumer expectations. The iPhone 4 antenna issue, and the Apple Maps brouhaha, suggest that consumers want nothing less than perfect and, rather than launching risky products that might inspire a mixed reaction or turn out to be the next Newton (or Zune), caution has spread across the industry.

The real story behind the focus on portable electronics is not so much that a life-changing innovation is just around the corner. It’s much more that there is profit in the upper end of the market, the mass market having already been commoditized. Smartphone advances reflect the importance of competition as, even though Apple sees the rise and fall of Nokia as a cautionary tale, history suggests that product development in a commoditized market tends to be slow. Most companies see little to no point in spending hundreds of millions of dollars to marginally improve a product that will likely sell at the same price point as before. That’s the sort of context in which a short-sighted CEO of a company like Hewlett-Packard might decide that it no longer makes sense to fund research that is not directly aimed at turning a profit, or why a similarly short-sighted company’s products might go from excellent to “good enough” in order to increase margins by decreasing production costs. (Am I talking about the same company?)

One might argue that televisions have seen marked advances in technology despite being a largely commoditized market, but that has been driven in no small part by the introduction of HDTV and the money poured into the development of new displays for computer users and commercial settings. Even in that context, major players like Panasonic have a very difficult time turning a profit, and the pool of companies that produce television displays and sets is not expanding.

One area that has seen a surprising lack of innovation is the desktop computer market. That’s in part because it’s a tough nut to crack – computers do pretty much what we want them to do, there are no obvious ways to dramatically improve the user interface, and the technologies for interacting with computers other than through a mouse and keyboard tend to focus on niche users or turn out to be largely impractical. It may be that one day we’ll have displays and “no touch” gesture controls as shown in the film, “Minority Report”, but that’s not on the horizon. Basically, the desktop computer market seems a lot like the television market. To the extent that incremental improvements are seen, they’re in no small part the result of R&D in the mobile marketplace. The biggest “innovation” we’ve seen in a desktop operating system was Microsoft’s annoying, clumsy interference with the user experience by putting a “smart tile” display between the computer user and the desktop – that is, they tried to make the desktop experience more like mobile, never mind whether that makes sense. Apple has made similar, albeit less in-your-face changes to its desktop operating system, with its Launchpad and App store, but they’re really not part of the ordinary desktop experience.

Somebody commented to me recently that Apple seemed to be “giving up” on the competition for desktop computers. I responded that they’re chasing money and market share, and that right now they can find both in the mobile space while there is little incentive to try to claw out a greater market share in the desktop market. The cost of significantly expanding their desktop presence would be significant, and there’s really not much money to be made in that market. Were Apple to start producing $300 – $600 portable computers it might find a market, but it would have to make the quality cuts that are readily apparent in computers in that price range, potentially costing it brand loyalty over the long run in the same manner that the low quality Apple products of the Sculley era damaged Apple’s reputation and competitiveness. Why mass produce low-cost computers that have to be sold at tiny margins and that would likely have an impaired user experience, when you can continue to sell $1000+ computers that people enjoy using, and sell millions of highly profitable iPads to the sub-$1,000 market?

Really, though, the desktop industry needs to be woken from its complacency, much in the manner that Google and Apple rebooted then-stagnant browser development with Chrome and Safari. The problem being, you either need a company that sees a long-term gain in developing new technology at a significant short-term cost, the way Xerox PARC laid the foundation for the computer mouse and windows-driven displays, or because they don’t want to be indentured to a competitor’s product. And if you take the HP Labs / Xerox PARC approach, you also need a visionary who can see how a new idea can be improved and put into widespread use – after all it was Apple, not Xerox, that turned the mouse and menu/windows-driven interface from an impractical lab-based demo to the desktop standard.

The manner in which the world, and Apple, has changed is perhaps best illustrated by today’s quiet announcement that the iMac has been updated. You can go to the Apple Store and buy one today – but the new version isn’t even flagged as “new”. A secondary illustration comes from the Mac Pro, the high-end computer Apple develops for the professional market, which is soon to be released in an innovative new case. But that’s innovation in the same sense as the Mac Mini was an innovation – great design and packaging, but nothing you couldn’t have accomplished in a traditional mini tower case. Apple did promote the redesigned Mac Pro, some months back, but when will it actually come to market? Later this year. There’s no sense of urgency, as there is in the highly competitive mobile marketplace.

An argument can be made that when a technology reaches a certain point of maturity, all new developments will be incremental. Perhaps the keyboard-and-mouse-driven desktop computer is pretty much it, and unless the entire concept is reinvented (much as the iPhone reinvented the smartphone market), this is as good as it gets. People seem disappointed when the new “state of the art” smartphone looks like the old one, as if there’s a great deal you can do to differentiate the hardware of a typical smartphone in ways that are obvious or exciting. Even in that market, unless a new, disruptive technology comes along, the biggest future changes will come through software. In fifteen years, today’s typical smartphone and tablet apps are likely to look about as sophisticated as Pong. But still, it would be nice to have a sense that somebody out there, somebody positioned to disrupt the market, was looking at “impractical, unworkable” new ideas from a different angle and asking, “What if….”

Retailers that ignore, discourage showrooming 'short-sighted …

The average shopper aged between 16 and 24 will wait in line at a checkout for just six minutes before giving up and leaving, according to previous research by omnichannel retail experts Omnico. Among older shoppers, that tolerance drops to under five minutes.

New findings from Omnico reveal an even more worrying trend for High Street retailers.

One in 10 shoppers has made a purchase, via mobile, from another retailer’s website while standing in a checkout queue. Among younger age groups, that figure rises to 15%.

This mobile behaviour is all the more reason for retailers to embrace omnichannel retail in-store, says Steve Thomas, chief technology officer at Omnico. The “natural urge” is for retailers to try to stop shoppers from using mobile to compare prices or inventory while in the store, but that’s a losing battle and one that will reflect badly on the store with consumers.

“Some retailers try to stop it, by ignoring consumers’ desire for free wi-fi or even blocking mobile signals, but this is a short sighted view,” said Thomas (via Internet Retailing). “Yes, price is very important, but there are many other factors that encourage loyalty to a brand and omni-channel, such as convenience and quality of service.”



5 great features in iOS 7 you'll want to try right away — Tech News …

As promised, Apple released iOS 7 on Wednesday for multiple iOS devices. The software represents a radical departure for the now six-year-old mobile platform, with a full face-lift of icons, fonts and features. Some of the changes are simply visual, but there are a host of new functions that have impressed me over the past few months of using the iOS 7 beta.

This is by no means an exhaustive list of every iOS 7 feature, but a highlight of some that I feel were done well and are immediately useful. And before you upgrade, you may want to get your iOS device ready for the new software with these steps.

iTunes Radio

I mentioned earlier today that even though I generally use Google Android devices, iTunes Radio has already found a place in my daily life. I’ll be canceling my Google Play Music All Access trial before the $9.99 monthly charge kicks in because I find iTunes Radio a better fit for me. Music tastes vary, so your experience could differ, but iTunes Radio won’t cost you anything to try; it’s free with iOS. You’ll find it in the iOS 7 Music app. Choose a preset station or create your own based on artists or songs you like.


An Improved Spotlight (if you can find it!)

Gone is the “swipe left from the home screen” action to search on your iOS device. Apple moved the search feature above the home screen, sort of. If you pull down from the top of the screen, you’ll get the new Notification Center. So how do you get to search? Pull down on the home screen itself and a search bar will appear at the top of the screen. Enter a search term and iOS will scour your device for any matches in Apps, Mail, Contacts, Music, Calendar events, Videos and more. It’s not quite a true universal search because you can’t search the web from here. Still, it’s quite good, and fast on my iPhone 5.


Better application updating

In the App Store settings, you can now tell iOS to automatically update your apps as needed. That’s better than getting a numbered badge to indicate there are updates available. Don’t worry, the Updates section of the App Store is still there. And if you set iOS up for automatic updates, this section offers a history of what was updated — and when — automatically. Even better: Once an app is updated, a small blue dot will appear next to the app name on your home screen so you know you have a new version of the app.

Control Center

Yup, this feature may have been lifted from Android, but it’s well worth it. Access the new Control Center by swiping up from the bottom of your display. No matter what app you’re in, or even if your phone is locked, the new Control Center pops right up, offering quick access to highly useful functions: Airplane Mode, Wi-Fi, Bluetooth, Do Not Disturb and Rotation Lock. You can also control screen brightness, control any music that’s playing, or tap buttons for AirDrop or AirPlay. Lastly, there are four icons for specific apps: Flashlight, Clock, Calculator and Camera.


Super app switching

It’s another “borrowed” feature, this time from webOS, but the multitasking interface is totally revamped and much better. Each open app is a card, and you can see three at a time. The app cards show each app in its last state; they’re not actively updating. Slide these around and you can quickly see all of your open apps; tap one and you’ll jump right in. Want to close an app, or two or three? Just drag the app cards up and off the screen, up to three at a time.

Apple says there are 200 new features in iOS 7, so there’s plenty more to explore and enjoy. This handful is among my favorite updates so far. It’s easy to think that the software is simply iOS 6 with a fresh coat of flat paint, but I think that’s a short-sighted observation. The more I’ve used iOS 7, the more the update has impressed me.

Another Short-Sighted Politician – Guy Fawkes' blog – Guido Fawkes


The Institute of Economic Affairs renames HS2

“The High-Speed Gravy Train”

There’s a theory that, with a family member nearby, Ugandan relations may be averted?