Firefox got real slow

One day Firefox just slowed to a crawl when running web pages with JavaScript. Then I found out what was causing it. Here’s how to fix it.

I was using the Firefox 3.6.16 web browser, and it got really slow. Unbearably so.

When I went to look at the console, it was filled with all kinds of detailed JavaScript warnings, the kind one might expect from JSLint. Firefox was spending so much time checking the JavaScript code that it was barely spending any time executing it.

The “culprit” was the Web Developer add-on.

Normally, the Disable / Disable JavaScript / Strict Warnings menu item is checked. Unchecking it gives some fantastic diagnostic messages when writing code. Forget to re-check it, though, and regular web browsing may slow to a crawl.

Aside from the console being a clue, you may also see a caution sign or a red circle with a white ‘x’ in it on the Web Developer toolbar.

For normal speeds during regular browsing, just disable the strict checking again.

Thick Menu Separators in ExtJS

Do the menu separators in your Ext web applications suddenly appear as thick bars instead of thin lines? Here’s how to fix it.

I ran across a problem in Firefox 3.6.16 where my JavaScript applications built with ExtJS were rendering menu separators that were extra thick.

Even Ext demo pages did this. [Screenshot: a menu separator rendered as a thick bar]

Here’s how I solved it.

The problem can be addressed with the Web Developer add-on.

Normally, its Disable / Disable Minimum Font Size menu item is checked. If it gets unchecked, Ext behaves very badly.

Re-check it and then reload the page; the menu separators will be back to normal.

Selenium: HTTP Status 404

Warning: another geeky log entry.

Today I was working with Selenium, the free web testing tool.

I ran into an interesting problem where I was trying to test over an SSL connection, but I got this error:

HTTP Status 404 – /selenium-server/core/RemoteRunner.html

If you don’t know why /selenium-server/ is being stuck onto your destination URL, or why you are passing a URL to the DefaultSelenium class constructor in the first place, then you need to go read the Selenium RC: Tutorial — not casually browse it — specifically the section entitled The Same Origin Policy.
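For context on why that path shows up at all: Selenium RC proxies your site and serves its test runner from under your application’s own origin, which is how it sidesteps the same-origin policy. Here’s a tiny, purely illustrative sketch (plain Python; the base URL is a placeholder of mine, not from any real setup) of where the runner page ends up:

```python
from urllib.parse import urljoin

# Selenium RC serves RemoteRunner.html under the SAME origin as the
# application under test, so the runner and the app share an origin.
base = "https://your-app.example/"  # placeholder base URL (an assumption)
runner = urljoin(base, "/selenium-server/core/RemoteRunner.html")
# runner -> "https://your-app.example/selenium-server/core/RemoteRunner.html"
```

That shared origin is the whole trick — and it is also why the proxy settings described below matter so much.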

Things I tried that didn’t solve it:

  • Advice about certificates wasn’t working.
  • I discovered Firefox 3 Beta wasn’t working with Selenium IDE, so I switched back to Firefox 2.0.0.16, which at least allowed http connections to work.
  • I used my own Firefox profiles with *custom instead of *firefox.

I even installed the CyberVillains CA certificate, which is mentioned in the Selenium RC: Tutorial under Support for HTTPS.

Still, I was getting the 404 Error when using https, but not http.

Then I found it.

Firefox’s profile got messed up as I was switching between versions. Under Tools / Options…, Network tab, Settings…, Manual proxy configuration: while HTTP Proxy was still set to localhost and port 4444, the “Use this proxy server for all protocols” checkbox had become unchecked. It should be checked.

Made sense, too. If https traffic is not going through the proxy, then Selenium can’t do its magic.

Also, make sure that No Proxy for is blank. This field normally lists localhost, 127.0.0.1, and other local addresses — but in this case, you do want that local traffic to go through your local Selenium proxy.
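If you would rather pin these settings in the profile itself instead of clicking through the dialog, the equivalent preferences can go in the profile’s user.js. This is a sketch — the pref names are my assumption based on Firefox’s proxy settings, so verify each one in about:config before relying on it:

```
// Sketch of the equivalent Firefox prefs (user.js); verify names in about:config.
user_pref("network.proxy.type", 1);                     // 1 = manual proxy configuration
user_pref("network.proxy.http", "localhost");
user_pref("network.proxy.http_port", 4444);
user_pref("network.proxy.share_proxy_settings", true);  // "Use this proxy server for all protocols"
user_pref("network.proxy.no_proxies_on", "");           // blank: even local traffic goes through Selenium
```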

Firefox Slow Page Load – Solved

Firefox 3 slow? 20-second page load times? Figured out why. And how to fix it.

A co-worker showed me an interesting problem with Firefox today. He loaded a page from our application (running on localhost), and the page content loaded instantly — but the page load itself didn’t end until it timed out 20 seconds later. Literally.

Everything we saw and measured, from the browser and from the sending application, showed that the content was sent in milliseconds and that the page load was just sitting there doing nothing. We were even using the latest Firefox beta.

Other browsers had no such problem.

We figured out what was going on using the Tamper Data add-on.

Turns out there was a Connection: keep-alive header in the response. When we changed it from keep-alive to close, the browser behaved as expected. That is, it loaded the page instantly.

A little web investigation showed that when you use a keep-alive connection, you must also send a Content-Length header (or use chunked transfer encoding), which the sending application wasn’t doing.
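To see why this matters: with keep-alive, the connection stays open after the response, so the Content-Length header is the only thing telling the browser the body is complete — without it, the browser waits until the connection times out. Here’s a minimal sketch (plain Python, function name is mine) of building a response the way the fixed application does:

```python
# Sketch: building a keep-alive HTTP/1.1 response with a correct
# Content-Length, so the browser knows exactly when the body ends.
def build_response(body: str) -> bytes:
    payload = body.encode("utf-8")
    headers = [
        "HTTP/1.1 200 OK",
        "Content-Type: text/html; charset=utf-8",
        "Connection: keep-alive",
        # Without this line the browser has no way to know the body is
        # finished, and waits for the keep-alive connection to time out.
        f"Content-Length: {len(payload)}",
    ]
    return "\r\n".join(headers).encode("ascii") + b"\r\n\r\n" + payload

resp = build_response("<html><body>hello</body></html>")
```

The key point is that the declared length is computed from the encoded body bytes, not guessed.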

A quick application tweak to send the content length, and everything ran super spiffy.

Now, if you don’t have access to the application that’s sending you web pages, you can twiddle with about:config and change the network.http.keep-alive setting to false.
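If you want that workaround to stick across sessions, the same toggle can go in the profile’s user.js. A config sketch — double-check the pref name in about:config first:

```
// Workaround, not a fix: disabling persistent connections costs a new
// TCP connection per request, so re-enable it once the server is fixed.
user_pref("network.http.keep-alive", false);
```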