nginx hits #4 on Netcraft
Self-explanatory. nginx surpasses Lighttpd and moves up to #4 according to Netcraft. Woohoo!
I waited in extreme anticipation for my WiMAX adapter to be delivered. It wound up being a day late due to the weather. I unpacked it, installed the software, plugged it in, and ... nothing. No network found. What? WiMAX isn't supposed to have line-of-sight issues; it's supposed to "just work". I tried Clear's 24/7 online chat support, but the guy was just random first-level support - not much help. I decided I would give it another try once I was out in a zone that should definitely have a signal.
Right now I'm using it, and I was using it the other day. Both times I've been seeing ~5 Mbps down and 400 Kbps up according to speedtest.net. Very nice. It feels like I'm on a home wifi connection. If this were consistent, I'd be sold. However, I still can't claim it's the best thing since velcro until I figure out why I get not a single bar of signal at home.
I tried calling Clear the first time I was out, and of course it was working - I had expected it to fail like it did at home. Doh. Anyway, I need to do some more testing and diagnosis, but right now I have to say that if the coverage improves (or someone explains why I get nothing inside my house when it sounds like I should), WiMAX is pretty awesome. It's going to make me wish my laptop had it built in so I didn't need a separate USB dongle for it :)
I did it. I took the plunge. I ordered WiMAX. Thank you Clear for making Portland your second market for the service! I hope it is all that it promises.
I've got seven days to decide with a full refund, 30 days with a partial refund. Month-to-month, no contract. Sweet. I will let everyone know what it's like. Hopefully I'll be able to get it soon and run around town messing around with it. Too bad there aren't any phones in the US that can leverage it yet. It'd be neat if my iPhone could jump on it.
Contract Term: Month to Month
Plan Name: WiMAX Mobile - FREQUENT - No Commitment
Included: Up to 4 Mbps Downlink Speed
Up to 384 Kbps Uplink Speed
0 E-mail Addresses
2 GB monthly bandwidth
I'm sure I'll be using more than two gigs per month if it works well. It's only $10 more for unlimited, and I confirmed you can change plans anytime with prorated charges. They even sell home service, so the performance must be consistent... right? :)
Typically, updates to open source packages work without a hitch. However, last weekend's upgrade of my servers from Ubuntu Hardy to Intrepid wound up creating a couple of major headaches, and at the same time I noticed a handful of other snafus in open source packages I use daily.
This resulted in server instability, client annoyance, and 20-30 solid hours of trial-and-error compiling, testing, debugging, etc. Even right now, if I forget to hold the libgpac-dev package back from being updated, all videos being converted lose their sound due to MP4Box crashing.
Stewart, Brian and Jay were having some fun and were able to implement support for Unicode characters in Drizzle schemas.
I'm pretty sure that this is not allowed in -any- massively used database. So +1 for Drizzle, right?
In my mind, not so much.
First off, if the main website linking to your blog can't even display the character correctly, maybe it's time to rethink whether the feature is worth keeping.
Second, while this helps prove that everything in Drizzle will be Unicode-capable, it creates an utter nightmare for supporting schemas created with high-ASCII or foreign characters. Some of us are already supporting other people's existing Latin-based schemas. Now add in the need to have a character map open and terminal windows that can copy/paste/display Unicode characters, and the difficulty of working with these increases ten-fold. Not to mention that the filenames storing the databases/tables on the filesystem will become harder to work with too.
Lastly, look at everything out there that -is- Latin-based, and nobody is hurt by it. For example:
It is just a fact of life that native Chinese speakers (currently) have to write code in Latin characters, design their schemas in Latin characters, and visit the majority of websites (if not all) in Latin characters. I don't see anyone trying to change C or C++ to support Unicode identifiers... there just isn't a real reason for it.
To summarize, in my opinion, what seems cool today will become a headache in the future if unleashed. I totally support end-to-end (and only) Unicode for data and such, but allowing it in schemas is a bit too much.
Not anymore! (Well, basically)
I've changed the default privileges so only registered users can post. That still means the spammy bots can post if they have an account and/or can still register (I don't think MediaWiki does anything to thwart automated signups), but this should make it a little more manageable.
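For reference, this kind of lockdown lives in LocalSettings.php via the $wgGroupPermissions array. A minimal sketch of the sort of change I made (group names and defaults can vary by MediaWiki version):

# LocalSettings.php - only registered users may edit pages
$wgGroupPermissions['*']['edit'] = false;    # anonymous users: no editing
$wgGroupPermissions['user']['edit'] = true;  # logged-in users: editing allowed
# and, if you wanted to go further, stop anonymous signups too:
# $wgGroupPermissions['*']['createaccount'] = false;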
If I run into this again, I guess I'll have to look for third party modules to plug in.
Continuing from http://michaelshadle.com/2008/11/26/updates-on-the-http-file-upload-front/ ...
I've been doing some research and more hacking. Code should find its way out there sometime soon. Here are my notes since the last installment of "As The File Upload World Turns".
The only thing missing is a better attempt to see whether Gears will retry the upload on a failure. I believe it is possible when dealing with a worker pool, but this is -very- basic XHR usage at the moment. Perhaps, since it is JavaScript-based, we can add our own re-transmission code. That's the next piece I'm going to mess around with.
Stay tuned for the results... (and code, most likely!)
When you enter the realm of "friendly URLs", "slugs", "nice names", or whatever else you call them, everything can look a lot better. However, if done incorrectly, you can wind up with duplicate indexed pages and the like. I couldn't sleep and wanted to try approaching this a different way, and every URI I've thrown at it comes out how I want it.
Why does this matter? Typically, without rewrites, using normal webserver, directory, and file semantics, a request for "/foo" makes the webserver bounce you to "/foo/" - but when dealing with rewritten URLs, there is no enforcement of this behavior. A lot of the time (at least with the stuff I'm currently dealing with) the same page shows up at both "/foo" and "/foo/", and each is considered unique to a search engine. That duplication of data violates the normalization devil in me! Even worse, certain apps might not even process the two requests the same way: "/foo" could load one page, and "/foo/" could load another, or an error. On top of that, when people send URLs out, sometimes they take artistic license with what they look like. This is to thwart all that and force search engines, users, etc. to all use the same URL. I chose to enforce URLs ending with "/" since I think it helps establish that "final" signoff of the non-query-string portion of the URI.
You could go about this in different ways; however, due to the way my nginx rewrites are done, I can't rely on $_SERVER['QUERY_STRING'], which is how I had originally written it - and I wondered why I was getting some weird behavior. Now I realize the function needs to handle whatever full string is passed to it, and then it behaves appropriately.
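For context, the rewrites in question look something along these lines - a simplified, hypothetical sketch, not my exact config:

# hypothetical nginx rewrite, simplified for illustration
location / {
    if (!-e $request_filename) {
        # the path lands in ?q=...; nginx appends any original query args
        # after it, so PHP's QUERY_STRING is no longer just what the user sent
        rewrite ^(.*)$ /index.php?q=$1 last;
    }
}

With a rewrite like that, $_SERVER['QUERY_STRING'] contains the rewritten arguments rather than the original query string alone, which is why the function below works on the full URI instead.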
Below is the function and some sample code to exercise it. There are probably a few opcodes that could be saved, in trade for a little memory, by precalculating the length of the URI, the position of the "?" if there is one, etc. However, I hate defining variables that only get used once (a major pet peeve is when people create a new variable for absolutely no reason; this one would at least save a couple CPU cycles...), so this is my least-amount-of-code-possible version. Enjoy.
# some example URIs
$uris = array(
    "/bar/",
    "/bar",
    "/bar/ee",
    "/bar/index.html",
    "/bar/index.php",
    "/bar/index.php?fds",
    "/bar/index.php?f=bar&fbahd=3",
    "/bar/index.php?http://www.foo.com",
    "/bar/index.php/bark",
    "/bar/index.php!meow|fJG)*#)$*J:g",
    "?somehow",
    ""
);

foreach ($uris as $uri) {
    echo $uri." => ".normalize_uri($uri)."\n";
}

function normalize_uri($uri)
{
    # if there is a query string, chop it off and put it aside
    $query = '';
    if (strpos($uri, '?') !== false) {
        $query = substr($uri, strpos($uri, '?'));
        $uri = substr($uri, 0, strpos($uri, '?'));
    }

    # scrub any index.* stuff off the end (note the escaped dot)
    $uri = preg_replace('/index\.(\S{0,3})$/', '', $uri);

    # if it doesn't end with a '/', add one
    if (substr($uri, -1) != '/') {
        $uri .= '/';
    }

    # finally, put the query string back on
    if ($query != '') {
        $uri .= $query;
    }

    return $uri;
}
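Assuming the function above, the sample loop should print something like this (the oddball inputs at the end mostly demonstrate that garbage in yields slash-terminated garbage out):

/bar/ => /bar/
/bar => /bar/
/bar/ee => /bar/ee/
/bar/index.html => /bar/
/bar/index.php => /bar/
/bar/index.php?fds => /bar/?fds
/bar/index.php?f=bar&fbahd=3 => /bar/?f=bar&fbahd=3
/bar/index.php?http://www.foo.com => /bar/?http://www.foo.com
/bar/index.php/bark => /bar/index.php/bark/
/bar/index.php!meow|fJG)*#)$*J:g => /bar/index.php!meow|fJG)*#)$*J:g/
?somehow => /?somehow
 => /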
You'd tie this in with something like:
header('Location: http://'.$_SERVER['HTTP_HOST'].$uri, true, 301);
exit();
That makes sure the redirect goes out with a 301 (search engine friendly) status code. Don't hardcode the scheme, though - use https or http depending on what your site uses. Something like this should work:
if (isset($_SERVER['HTTPS']) && strtolower($_SERVER['HTTPS']) == 'on') {
    $scheme = 'https://';
} else {
    $scheme = 'http://';
}
This is how I would throw it all together:
if (isset($_SERVER['REQUEST_URI'])) {
    # fill in $_SERVER['REQUEST_URI'] with whatever is holding the original URI
    $normalized = normalize_uri($_SERVER['REQUEST_URI']);
    # compare against the original; only redirecting when they differ avoids
    # a redirect loop when a query string keeps the URI from ending in '/'
    if ($normalized != $_SERVER['REQUEST_URI']) {
        if (isset($_SERVER['HTTPS']) && strtolower($_SERVER['HTTPS']) == 'on') {
            $url = 'https://';
        } else {
            $url = 'http://';
        }
        $url .= $_SERVER['HTTP_HOST'].$normalized;
        header('Location: '.$url, true, 301);
        exit();
    }
}
Now, it is 3:30am and I am trying to compose this inside of WordPress, but I believe that will work.
Right now I'm happy. Why's that? Because I have a neat Google Gears-based file uploader that uses hardly any code and has no special server or code requirements. It is *almost* everything I've been trying to accomplish with my behind-the-scenes efforts to scare up some development... and it came together so quickly it was scary.
Big props to Raghava @ work. He did the legwork of putting together the base code that makes Gears work properly and enough for me to play around with. Between the two of us I think we'll be able to conquer the rest of it.
Okay, so what exactly was I hoping for?
Right now, what we've got meets most of the needs. The biggest gap is making sure it attempts re-transmission. Even without that this is still very cool, but re-transmission will be one of the major benefits this exercise could offer.
Perhaps soon I will post some code. I'll want to ask Raghava if he minds first. Maybe some JavaScript/Gears gurus could even clean it up or add more functionality.
This is for WordPress 2.6.3 - not sure which other versions it will work on. I originally got this from http://www.untwistedvortex.com/2008/06/27/adjust-wordpress-autosave-or-disable-it-completely/; however, it was still giving me a JavaScript error.
Note that this worked fine in FF 2.x and IE6 for English, but for some reason it broke in Chinese, and only in IE6. It kept telling me it was failing to update a post, even though it was a new post. I finally traced it down to autosave.js changing the value of the action to "editpost" instead of "post". I'm not exactly sure why the behavior differs between browsers, but at this point I don't care - the autosave feature is not necessary for the blog site I am supporting for work.
At first I tried a wp-config.php approach, from http://sheldon.lendrum.co.nz/disabling-wordpress-auto-save-and-revision-saving_227/04/ - but that must not work in 2.6.x, as it still appeared to be autosaving. The next Google result was a plugin-style fix that dequeued the autosave.js script - which was indeed the culprit, but the fix wasn't quite complete (at least in my 2.6.3 install). So here is the complete fix (no JavaScript errors that I can tell):
First, you need to make a file called "autosave.js" and throw it in wp-content/plugins, the content of which is:
function autosave() { return true; }
Next, throw this in an existing plugin or make a standalone file (I have a file with a bunch of random WP "fixes" so I just threw it in there):
function disable_autosave() {
    wp_deregister_script('autosave');
    wp_enqueue_script('wphacks', '/wp-content/plugins/autosave.js', array(), '20081123');
}
add_action('wp_print_scripts', 'disable_autosave');
A big thanks to the guy at Untwisted Vortex for getting me on the right path. I'm sure there are a couple other ways to do this, but this works just fine, and it maps to the primary goal of not editing the package's core code.