2015-04-12

2013 WRX Swaybar Installation

Introduction


My wife and I run a 2013 Subaru WRX hatchback at autocross and rallycross events. In an effort to improve handling and reduce tire wear, we decided to upgrade the front and rear swaybars and endlinks. We chose a 24mm two-position adjustable front swaybar and a 22mm three-position adjustable rear swaybar and adjustable front and rear endlinks, all from Whiteline. The rear swaybar comes with an additional support brace to provide a little added stiffness.

The equipment came from the good folks at Subie Autosport, who were more than happy to advise us on what would work best for our purpose. Two days after we ordered, the gear was on our doorstep.

This is the story of two people with limited mechanical ability attempting to do this installation. I have done a few oil changes and many wheel/tire swaps, but little else. My wife has done similar work on other cars but not the WRX yet.

I've read many forum posts on the subject, I've watched about a dozen videos, and I've read the included directions. My wife also read the directions and watched a couple of the install videos with me. Supposedly this is under an hour of work per swaybar; we gave ourselves the better part of a day to do it, given our collective inexperience.

Part I: Front Swaybar & Endlinks


Step 1: Remove the plastic undertray. This went smoothly.

Step 2: Remove the crossmember to gain access to the swaybar. This is where the trouble started. Ten bolts, all stuck like they were glued in. Doused everything in penetrating oil, eventually got all the bolts off using a breaker bar, which we had to run out and buy (I didn't own one). I think this is where I injured my shoulder - seems like a partially torn rotator cuff. Getting the crossmember off took about three hours, including giving the oil some time to sink in, and going to the auto parts store for the breaker bar.

Step 3: Remove the swaybar. Again, every bolt was seized. Doused them all with penetrating oil, let it sit. Got the nuts off the endlinks. Got the nuts off the D-brackets. Removing the bolts from the D-brackets, one of the bolts sheared. We tried for a while to remove the broken end of the bolt from the top, but access is extremely poor. Did some research, and decided to try using an easy-out in the hope we wouldn't have to drill out and re-ream the bolt hole. We finally got the brackets off, disconnected the endlinks, and got the swaybar off. At this point it was 10pm, about five hours into our two-hour project, and we don't own an easy-out. I loaded up on NSAIDs and iced my shoulder.

The next day, we went to the hardware store and picked up an easy-out, and after a lot of drilling, finally got the bolt out. At this point my wife was doing most of the work; I was mostly providing technical advice and the occasional third hand. Despite the package's claim of "10 seconds", this took us about an hour and the better part of an 18v drill battery. We also went to the auto parts store and got a replacement bolt.

Step 4: Remove the endlinks. Surprise, these bolts were stuck too. Penetrating oil and a lot of elbow grease finally got them off. It is incredibly difficult to hold an allen wrench while you wrench a nut off, with your hand inside the wheel.

Step 5: Install the new endlinks. The Whiteline endlinks come with nuts that have a plastic piece on the front side - the kind you usually see used as a stop so you can't over-tighten the nut. In this case, however, you're supposed to torque right past it, which we found out after some more research. Also, those red plastic things that are in every product picture? They're for protection during shipping. Remove them before installing - that's not mentioned in the instructions.

We start torquing them down and run out of depth in the socket - the stud sticks out too far through the nut for a standard socket. My box-end wrench set only goes up to 14mm and the nuts on the Whiteline endlinks are 17mm. It's Sunday, and the shops are closed by this time. We attach both as best we can and leave it until Monday so we can run out yet again for more tools.

Monday we get a 17mm deep socket (no 17mm box-ends to be found), and get the endlinks attached. Day 3, after 5 trips for parts and tools, we've finally got new parts installed on the car.

Step 6: Install the new swaybar. Once the endlinks were on, getting the swaybar on and the first endlink attached was relatively short work. We put them on the softer of the two settings. The new bushings are grease-less, so there really wasn't much to it.

Getting the second endlink attached turned out to be an ordeal. It seemed physically impossible for the endlink to fit. We did some research online and found some people were attaching the endlink to the swaybar first, then to the chassis, so we attempted that, which worked after some finagling.

We found a service manual online for the torque specs and torqued everything down... except that the torque wrench we have is too large to use on the endlinks, so yet another trip to the auto parts store got us a new torque wrench for tighter spaces.

Step 7: Reinstall the crossmember. Just a matter of elbow grease.

Finishing up: At last, we got everything torqued down and took it for a test drive - we were going to do the front and rear at once, but at this point we were three days in and wanted to see some results. Turn-in is much improved, and the reaction to steering input is much quicker, making all that work worthwhile. Re-torqued everything after the test drive, per the instructions.

Part II: Rear Swaybar, Endlinks and Braces


We did the rear the following weekend, so my shoulder had a chance to heal and I was back to doing my fair share. The rear went far more smoothly than the front, but we did run into some small snags.

Step 1: Remove the factory swaybar. This wasn't terribly difficult, it just required some muscle and a breaker bar.

Step 2: Remove the factory endlinks. This was a little bit harder; the lower connection point in the rear crossmember was pretty well seized. A good deal of penetrating oil and muscle on the breaker bar finally got them free.

Step 3: Install the new endlinks. Because of the way they slot into the crossmember, which is a tight fit, this took some finagling; wiggling them back and forth and applying pressure got them into the proper position. We greased the new bolts and hand-tightened them, so they could still be moved side to side to make it easier to get them into the swaybar.

Step 4: Install the new swaybar. Again, greaseless bushings make this a pretty straightforward task... until we stripped the upper bolts for the D-brackets that attach the bushings. Both of them. I'm not sure what the issue was, but both sides did the same thing - tighten, tighten, fine, then as we tried to torque them down, just as it seemed to be getting close to the proper torque setting, they suddenly got easier to turn again - and kept turning. Not spinning free like a totally stripped thread; they just never got tighter.

Since the rear of the bolt is accessible, we picked up a couple of M8 bolts, put them on the back, and torqued them down to spec without issue. Then we got the endlinks attached, putting the rear bar on its middle setting, and torqued those down. There was also an issue with the lower bolt on the driver's side, which was inaccessible with a socket because the exhaust is in the way; this had to be torqued by hand. We basically torqued the bolt on the passenger side with the torque wrench, then torqued the driver's side with a box-end wrench until it felt the same. Certainly not perfect, but the best we had available.

Step 5: Install the braces. The most difficult part was getting the lower control arm bolt off in order to attach the braces; these seemed to be torqued far tighter than the factory-specified 59 ft-lbs. It was tight enough that we couldn't do it with a breaker bar using a 1/2"-to-3/8"-drive adapter, and had to run out for a 1/2"-drive socket set. A lot of penetrating oil, patience, and torque finally got them off. We attached the braces, which connect the lower attachment point of the D-bracket to the inner attachment point of the control arm.

Finishing up: We torqued everything down and took it for a test drive; all good. The steering is solid and stable, roll is vastly reduced, turn-in is better, steering is significantly more responsive, and much more neutral, with just a hint of oversteer. We lift it back up and torque everything down again per instructions.

A week later, we torqued it all down one last time; you're supposed to do it after the first 100mi but we didn't have a chance until the next weekend with almost three times that. Most of the bolts were still set, a couple were off just a little.

Conclusion

All in all it took us much longer than it probably should have, certainly longer than we thought it would, and required all sorts of equipment we didn't know we'd need. But the car is better for it, and we get the pride of knowing that not only did we install it ourselves, but we were able to work through a few difficulties along the way and get the job done. Not to mention that our garage is a bit better equipped - we spent $5 on nuts and bolts, and all the rest went to tools that will last and that we'll use again.

We both learned a lot doing it - my wife commented on how far she had come just in terms of knowing the tools. At the start I occasionally had to explain what some tool was called or what the differences between them were, but by the end she knew them all and had no trouble choosing the right socket set for the job out of what was available at the store when we found we needed something we didn't have.

And you know what? Working on a car with your wife is pretty nice. It might have gone more smoothly and quickly if there was someone there who knew what they were doing, rather than two complete amateurs. But it was a shared experience, it was time spent together, and I can't imagine anyone I'd rather crawl around under a ton and a half of Japanese engineering with.

2014-03-21

A Wrinkle in Time

You've built a prototype, everything is going great. All your dates and times look great, they load and store correctly, everything is spiffy. You have your buddy give it a whirl, and it works great for them too. Then you have a friend in Curaçao test it, and they complain that all the times are wrong - time zones strike again!

But, you've got this covered. You just add an offset to every stored date/time, so you know the origin time zone, and then you get the user's time zone, and voila! You can correct for time zones! Everything is going great, summer turns to fall, the leaves change, the clocks change, and it all falls apart again. Now you're storing dates in various time zones, without DST information, you're adjusting them to the user's time zone, trying to account for DST, trying to find a spot here or there where you forgot to account for offsets...

Don't fall into this trap. UTC is always the answer. It is effectively time-zone-less, as it has an offset of zero and does not observe daylight saving time. It's reliable, it's universal, it's always there when you need it, and you can always convert it to any local time you need. Storing a date/time with time zone information is like telling someone your age by giving your birthday and today's date - you're dealing with additional data and additional processing for zero benefit.

When starting a project, you're going to be better off storing all dates as UTC from the get-go; it'll save you innumerable headaches later on. I think it is atrocious that .NET defaults to system-local time for dates; this is one of the few areas where I think Java has a clearly better design. .NET's date handling in general is a mess, but simply defaulting to local time when you call DateTime.Now encourages developers to adopt bad practices; the exact opposite of the stated goal of the platform, which is to make sure that the easy thing and the correct thing are, in fact, the same thing.
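
To make that concrete, here's a minimal sketch of the pattern in C# - SaveToDatabase and LoadFromDatabase are hypothetical stand-ins for whatever persistence you're actually using:

using System;

static class TimeExample
{
    // Hypothetical stand-ins for your actual persistence layer.
    static void SaveToDatabase(DateTime utc) { /* ... */ }
    static DateTime LoadFromDatabase() { return new DateTime(2014, 3, 21, 14, 30, 0); }

    static void Main()
    {
        // Store timestamps in UTC; DateTime.UtcNow already has Kind = DateTimeKind.Utc.
        DateTime createdUtc = DateTime.UtcNow;
        SaveToDatabase(createdUtc);

        // Values read back from storage typically come back with Kind = Unspecified,
        // so mark them as UTC explicitly before converting.
        DateTime fromDb = DateTime.SpecifyKind(LoadFromDatabase(), DateTimeKind.Utc);

        // Convert to a time zone only at the edge, when displaying to the user.
        DateTime forDisplay = TimeZoneInfo.ConvertTimeFromUtc(fromDb, TimeZoneInfo.Local);
        Console.WriteLine(forDisplay);
    }
}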

On a vaguely related note, I've found a (in my opinion) rather elegant solution for providing localized date/time data on a website, and it's all wrapped up in a tiny Gist for your use: https://gist.github.com/aprice/7846212

This simple jQuery script goes through elements with a data attribute providing a timestamp in UTC, and replaces the contents (which can be the formatted date in UTC, as a placeholder) with the date/time information in the user's local time zone and localized date/time format. You don't have to ask the user their time zone or date format.

Unfortunately it looks like most browsers don't take into account customized date/time formatting settings; for example, on my computer, I have the date format as yyyy-mm-dd, but Chrome still renders the standard US format of mm/dd/yyyy. However, I think this is a relatively small downside, especially considering that getting around this requires allowing users to customize the date format, complete with UI and storage mechanism for doing so.

2014-03-13

On Code Comments

I've been seeing a lot of posts lately on code comments; it's a debate that's raged on for ages and will continue to do so, but for some reason it's been popping up in my feeds more than usual the last few days. What I find odd is that all of the posts generally take on the same basic format: "on the gradient of too many to too few comments, you should aim for this balance, in this way, don't use this type of comments, make your code self-documenting." The reasoning is fairly consistent as well: comments get stale, or don't add value, or may lead developers astray if they don't accurately represent the code.

And therein lies the rub: they shouldn't be representing the code at all. Code - clean, self-documenting code - represents itself. It doesn't need a plain-text representative to speak on its behalf unless it's poorly written in the first place.

It may sound like I'm simply suggesting aiming for the "fewer comments" end of the spectrum, but I'm not; there's still an entity that may occasionally need representation in plain text: the developer. Comments are an excellent way to describe intent, which just so happens to take a lot longer to go stale, and is often the missing piece of the puzzle when trying to grok some obscure or obtuse section of code. The code is the content; the comments are the author's footnotes, the director's commentary.

Well-written code doesn't need comments to say what it's doing - which is just as well since, as so many others have pointed out, those comments are highly likely to wind up out-of-sync with what the code is actually doing. However, sometimes - not always, maybe even not often, but sometimes - code needs comments to explain why it's doing whatever it's doing. Sure, you're incrementing Frobulator.Foo, and everybody is familiar with the Frobulator and everybody knows why Foo is important and anyone looking at the code can plainly see you're trying to increment it. But why are you incrementing it? Why are you incrementing it the way you're doing it in this case? What is the intent, separate from its execution? That's where comments can provide value.

As a side note (no pun intended), I hope we can all agree that doc comments are a separate beast entirely here. Doc comments provide metadata that can be used by source code analyzers, prediction/suggestion/auto-completion engines, API documentation generators, and the like; they provide value through some technical mechanism and are generally intended to be read somewhere else, not in the source code itself. Because of this I consider doc comments to be a completely separate entity that just happens to be encoded in comment syntax.
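
For clarity, this is the sort of thing I mean - a made-up C# example with XML doc comments, reusing the Frobulator from earlier:

using System;

public class Frobulator
{
    private int foo;

    /// <summary>Increments the Foo counter by the given amount.</summary>
    /// <param name="amount">How much to add; must be non-negative.</param>
    /// <returns>The new value of the counter.</returns>
    /// <exception cref="ArgumentOutOfRangeException">
    /// Thrown when <paramref name="amount"/> is negative.</exception>
    public int IncrementFoo(int amount)
    {
        if (amount < 0) throw new ArgumentOutOfRangeException("amount");
        return foo += amount;
    }
}

The comment here doesn't restate the code; it exists to feed IntelliSense, analyzers, and generated API docs.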

My feelings on doc comments are mixed; generally speaking, I think they're an excellent tool and should be widely used to document any public API. However, there are few things in the world more frustrating than looking up the documentation for a method you don't understand, only to find that the doc comments are there but blank (probably generated or templated), or are there but so out of date that they're missing parameters or the types are wrong. This is the kind of thing that has developers flipping desks at two in the morning when they're trying to get something done.

2014-02-25

New Laptop! ASUS ROG G750JW

I recently received the generous gift of an ASUS ROG (Republic of Gamers) G750JW laptop, and let me tell you, the thing is a beast. Seriously, it's huge.

It's a 17" widescreen laptop (1920x1080 TN panel, no touch thankyouverymuch), with an extra two inches or so of chassis behind the hinge. It also weighs just short of ten pounds.

But, I wasn't looking for an ultraportable. I wanted something that I could use around the house and on the road, primarily for software development, but also for occasional gaming. That meant I needed a comfortably-sized keyboard, trackpad, and display; that meant a 17" laptop. I wanted decent battery life and decent performance, which meant it would be heavy for its size. And I got exactly what I asked for.

The G750JW runs a Core i7 at 3.2GHz, 12GB of RAM, an NVidia GeForce 765m, and a 750GB HDD. Step one was replacing the HDD with a 240GB Crucial M500 SSD I picked up for $135 on Amazon - less than half what I paid for a nearly identical drive just over a year ago. The difference in speed is truly staggering, going from a 5400 RPM laptop hard drive to a full-tilt SSD. It also cut a few ounces off the weight, and added a good half hour to hour of working time on the battery, so a win across the board.

I tried installing Windows 7 on it as I despise Windows 8, but kept running into an error during the "extracting files" stage of the installation. I found numerous posts online from people with the same problem, some of them with solutions, but none of those solutions worked for me; from what I can tell, it appears to be some conflict between the latest-and-greatest UEFI in the G750's motherboard and the aging Windows 7 OS. It's a shame, but I suppose being forced to gain more familiarity with Windows 8 isn't all bad; I just wish I had the option to use something more, well... usable.

Other than the OS though, it's been a joy. It performs extremely well, it has all the features and specs I need for what I'm using it for, and it's a beast for gaming - more horsepower than I really need considering I'm not a huge gamer and gaming was not the primary purpose of the laptop to begin with. Part of its bulk comes from the two huge rear-venting fans in the thing, which do a good job of keeping it cool - something I've had problems with when using other laptops, and which was the ultimate bane of my wife's old MacBook Air. I don't think I need to worry about it overheating and locking up while playing video like the MBA did on a regular basis.

My only gripe at the moment is that it seems to be impossible to find a decent Bluetooth mouse. Sure, the market is flooded with wireless laptop mice; but 95% of them use a proprietary receiver (I'm looking at you, Logitech!) rather than native Bluetooth, which requires you to use the provided USB dongle. That seems like an utter waste considering the laptop has a built-in transceiver capable of handling mice without any USB dongle.

All I really want is a decent-sized (I have large hands) Bluetooth wireless mouse, with a clickable scroll wheel and back/forward thumb buttons. That doesn't seem like too much to ask, but as far as I can tell, it just doesn't exist. Thankfully the laptop has a very generous touchpad with multi-touch, and clicking both the left and right buttons together generates a middle-click. Still, I really hope Logitech gives up on the proprietary wireless idea and gets on board with the Bluetooth standard, because I'd like to have a decent mouse to use with it.

It's telling that, on Amazon, you can find a discontinued Logitech Bluetooth mouse that meets my requirements - selling in new condition for a mere three hundred dollars. That's three times what Logitech's finest current proprietary wireless mouse costs, for an outdated, basic mouse. That's how much standard Bluetooth wireless is worth to people. Wake up Logitech!

Any suggestions on a suitable mouse in the comments would be greatly appreciated...

2014-02-07

Optimizing Entity Framework Using View-Backed Entities

I was profiling a Web application built on Entity Framework 6 and MVC 5, using the excellent Glimpse. I found that a page with three lists of five entities each was causing over a hundred query executions, eventually loading a huge object graph with hundreds of entities. I could eliminate the round trips using Include(), but that still left me loading way too much data when all I needed was aggregate/summary data.
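
For reference, Include() is EF's eager loading: it pulls navigation properties in with the original query instead of issuing a separate lazy-load query per entity. A rough sketch, with invented entity names (db is assumed to be your DbContext; Orders, Customer, Lines, and Product are placeholders, not the real model):

using System.Data.Entity; // the lambda-based Include() extension lives here in EF6
using System.Linq;

// Without Include(), touching order.Customer or order.Lines later triggers a
// separate lazy-load query for each entity - the classic N+1 problem.
var recentOrders = db.Orders
    .Include(o => o.Customer)                      // eager-load the parent
    .Include(o => o.Lines.Select(l => l.Product))  // and the nested children
    .OrderByDescending(o => o.CreatedUtc)
    .Take(5)
    .ToList();

That fixes the round trips, but as noted, it still drags the whole object graph into memory just to compute a few sums.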

The problem was that the aggregates I needed were complex and involved calculated properties, some of which were based on aggregates of navigation collection properties: a parent had sums of its children's properties, which in turn had sums of their children's properties, and in some cases parents had properties that were calculated partly based on aggregates of children's properties. You can see how this quickly spun out of control.

My requirements were that the solution had to perform better at returning the same data, while still allowing me to use standard Entity Framework, code-first, with migrations. My solution was to calculate this data on the server side, using entities backed by views that did the joining, grouping, and aggregation. I also found a neat trick for backward-compatible view releases:

IF NOT EXISTS (SELECT Table_Name FROM INFORMATION_SCHEMA.VIEWS WHERE Table_Name = 'MyView')
    EXEC sp_executesql N'create view [dbo].[MyView] as select test = 1'
GO
ALTER VIEW [dbo].[MyView] AS
SELECT ...

It's effectively upsert for views - it's safe to run whether or not the view already exists, doesn't ever drop the view if it does exist (leaving no period where a missing view might cause an error), and it doesn't require keeping separate create and alter scripts in sync when changes are made.

I then created the entities that would represent the views, using unit tests to ensure that the properties now calculated on the server matched expected values the same way that the original, app-calculated properties did. Creating entities backed by views is fairly straightforward; they behave just like tables, but obviously can't be modified - I made the property setters protected to enforce this at compile time. Because my view includes a row for every "real" entity, any query against the entity type can be cast to the view-backed type and it will pull full statistics (there is no possibility of an entity existing in the base table but not in the view).

Next I had to create a one-to-one association between the now-bare entity type and the view type holding the aggregate statistics. The only ID I had for the view was the ID of the raw entity it was connected to. This turned out to be easier said than done - Entity Framework expects that, in a one-to-one relationship, it will be managing the ID at one end of the relationship; in my case, the IDs at both ends were DB-generated, even though they were guaranteed to match (since the ID in the view was pulled directly from the ID in the entity table).

I ended up abandoning the one-to-one mapping idea after a couple days' struggle, instead opting to map the statistics objects as subclasses of the real types in a table-per-type structure. This wound up being relatively easy to accomplish - I added a Table attribute to the subtype, giving the name of the view, and it was off to the races. I went through updating references to the statistics throughout LINQ queries, views, and unit tests. The unit and integration tests proved very helpful in validating the output of the views and offering confidence in the changes.
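
The shape of it ends up looking roughly like this - the entity and property names are invented placeholders, not the real model:

using System.ComponentModel.DataAnnotations.Schema;

// The plain entity, mapped to its own table by the usual code-first conventions.
public class Project
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// The statistics subtype, mapped table-per-type onto the SQL view. The setters
// are protected because the view is read-only; EF can still materialize them.
[Table("ProjectStatistics")] // the name of the view created in the migration
public class ProjectStatistics : Project
{
    public int ChildCount { get; protected set; }
    public decimal ChildTotal { get; protected set; }
}

Queries that need the aggregates then ask for the subtype - something like db.Projects.OfType<ProjectStatistics>() - while everything else keeps using the base type.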

I then ran my benchmarks again and found that pages that had required over a hundred queries to generate now used only ten to twenty, and were rendering in half to a third the time - a one to two hundred percent improvement, using views designed purely to mimic the existing functionality - I hadn't even gone about optimizing them for performance yet!

After benchmarking, it looks even better (times are in milliseconds, min/avg/max):

Scenario                                     EF + LINQ       EF + Views
3 lists of 5 entities (3 types)              360/785/1675    60/105/675
2 lists of 6 entities (1 type)               325/790/1935    90/140/740
1 entity's details + 1 list of 50 entities   465/975/2685    90/140/650

These tests were conducted by running Apache JMeter on my own machine against the application running on Windows Azure, across a sampling of 500 requests per page per run. That's a phenomenal 450 to 650 percent improvement across the board on the most intensive pages in the application, and has them all responding to 100% of requests in under 1 second. The performance gap will only widen as data sets grow; using views will make the scaling much more linear.

I'm very pleased with the performance improvement I've gotten. Calculating fields on the app side works for prototyping, but it just can't meet the efficiency requirements of a production application. View-backed entities came to the rescue in a big way. Give it a try!