This is why I find hosting WordPress to be frustrating. It’s a huge security risk to not keep everything up to date. But if you do keep it all up to date, stuff tends to break.
SPOILER: It works fine out of the box if you have Git’s bin directory in your path.
I was chatting recently with a friend of mine, David Jacoby, and he mentioned a desire to purchase a Mac for development. Since he already had a Windows machine, I asked him why.
(A few years ago, I would have simply nodded in agreement. But Microsoft has been doing a great job lately of bringing open source development tech to Windows, either by baking it into Visual Studio or by paying the open source developers to port it, like they did with Node.)
He told me there’s some difficulty when you mix SSH, Vagrant, and Windows (or at least, there was).
I was planning to use Vagrant, so I needed to find out for myself. First off, I googled “vagrant putty ssh” and then read the linked Stack Overflow question.
Supposedly it’s been fixed for ten months now, but it was unclear to me whether the fix for vagrant ssh on Windows works with PuTTY or with the ssh command installed with Git.
Since I’d have to test to be sure anyway, I installed Vagrant and VirtualBox.
I followed the Vagrant Getting Started instructions:
$ vagrant init precise32 http://files.vagrantup.com/precise32.box
$ vagrant up
Vagrant up returned an error, but it was a very helpful error that told me to put the VBoxManage.exe binary on my path. I found it in C:\Program Files\Oracle\VirtualBox, added that to my path, and ran vagrant up again.
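If it helps anyone, here’s roughly the PowerShell equivalent of what I did; the install path is just where VirtualBox happened to land on my machine:

    # Add VirtualBox to the path for the current PowerShell session only
    $env:Path += ';C:\Program Files\Oracle\VirtualBox'
    # Sanity check: should print the VirtualBox version
    VBoxManage --version

(A permanent change means editing the path under System Properties, or using [Environment]::SetEnvironmentVariable.)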
And away it went!
Bringing machine 'default' up with 'virtualbox' provider...
[default] Box 'precise32' was not found. Fetching box from specified URL for
the provider 'virtualbox'. Note that if the URL does not have
a box for this provider, you should interrupt Vagrant now and add
the box yourself. Otherwise Vagrant will attempt to download the
full box prior to discovering this error.
Downloading or copying the box...
Extracting box...
Successfully added box 'precise32' with provider 'virtualbox'!
[default] Importing base box 'precise32'...
[default] Matching MAC address for NAT networking...
[default] Setting the name of the VM...
[default] Clearing any previously set forwarded ports...
[default] Creating shared folders metadata...
[default] Clearing any previously set network interfaces...
[default] Preparing network interfaces based on configuration...
[default] Forwarding ports...
[default] -- 22 => 2222 (adapter 1)
[default] Booting VM...
[default] Waiting for machine to boot. This may take a few minutes...
[default] Machine booted and ready!
[default] The guest additions on this VM do not match the installed version of
VirtualBox! In most cases this is fine, but in rare cases it can
cause things such as shared folders to not work properly. If you see
shared folder errors, please update the guest additions within the
virtual machine and reload your VM.
Guest Additions Version: 4.2.0
VirtualBox Version: 4.3
[default] Mounting shared folders...
[default] -- /vagrant
Now for the moment of truth: I typed vagrant ssh and got:
Welcome to Ubuntu 12.04 LTS (GNU/Linux 3.2.0-23-generic-pae i686)
* Documentation: https://help.ubuntu.com/
Welcome to your Vagrant-built virtual machine.
Last login: Fri Sep 14 06:22:31 2012 from 10.0.2.2
So it works great. The only question I have left is, “Is this relying on an SSH command that’s already on my machine? If so, did it come with PowerShell, installing Git on my path, posh-git, GitHub for Windows, or some forgotten tool?”
Update: In the comments, Dave Jacoby posted a command to check your path: ($env:Path).Replace(';', "`n"). We determined that having Git’s bin directory in your path is required.
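If you want to check your own machine, a few PowerShell one-liners will do it. The Git path in the last line is an assumption based on a default Git for Windows install; yours may differ:

    # Dave's trick: print each path entry on its own line
    ($env:Path).Replace(';', "`n")
    # Which ssh will vagrant ssh pick up? (errors if nothing named ssh is on the path)
    Get-Command ssh
    # Add Git's bin directory for the current session if it's missing
    $env:Path += ';C:\Program Files (x86)\Git\bin'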
I’ve been thinking about writing about the importance of owning your own content, then I saw this. Well said, Scott Hanselman.
Good guide, but you’ll want to walk through the steps in Google Analytics as you read it to fully understand.
Every now and then I get a request to change the behavior of the links to PDFs on our website. Currently we follow Jakob Nielsen’s recommendation on the issue:
… prevent the browser from opening the document in the first place. Instead, offer users the choice to save the file on their harddisk or to open it in its native application (Adobe Reader for PDF, PowerPoint for slides, etc.).
It’s good advice, but some people find it more convenient when PDFs open within a new browser window.
I explain that this can be problematic for a handful of reasons.
Then I offer some links for further reading.
When I dig a little deeper, it turns out that the stakeholder just wants their content to be as accessible as possible. I share that goal, and suggest the best way to do that is to convert the content into a web page. It makes the content more accessible, reusable, responsive, and more easily indexed by search engines.
Of course, converting it takes some additional effort. If the content is valuable enough to change the way links are handled across the site, then it’s valuable enough to put into HTML. And if there’s no time to do it by hand, Google offers a free document viewer that displays the PDF as HTML. Or if it’s an infographic, export it as an image file and put that on a web page.
Short version: you can cite stuff in the blockquote element, as you would expect. Unexpectedly, this was contrary to the initial HTML5 spec.
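A minimal example of the now-permitted pattern (my own markup, not the spec’s):

    <blockquote>
      <p>Any quotation lifted from somewhere else.</p>
      <footer><cite>The Source Title</cite></footer>
    </blockquote>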
I decided on Node and got the current version of my app migrated into it in about an hour, thanks to Express and Jade. Luckily, I just found out that Heroku now supports many versions of Node and npm, up to the latest. You just need to specify the Node version you want in package.json.
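For reference, here’s the relevant chunk of package.json. The version numbers below are just examples; check what Heroku currently supports before copying them:

    {
      "name": "my-app",
      "version": "0.0.1",
      "engines": {
        "node": "0.10.x",
        "npm": "1.2.x"
      }
    }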
I had initially tried to get it working with iisnode and WebMatrix, but I wasn’t able to figure out how to get Express working with it. Come to find out, it’s actually an open bug. iisnode is more than I need for this simple app anyway.
I’m not here to teach, but I wrote this for a web marketing blog at work that never went live. Didn’t want it to go to waste.
Thinking about a website upgrade? Is it on a new system with all new pages? Is anyone thinking about the links to the old pages? You’ll want to make sure you can identify and redirect the most important ones for the sake of your users and your Google rankings.
There are various ways to get this list depending on your setup, but one good way is to pull a report from your analytics tool (Google Analytics is a popular example). Most tools can provide a list of the most commonly visited URLs over a given date range. Set that range to a year or so and you’ll have a hefty list of your most important pages. Next, use your analytics software’s export feature (most have one) to save the list in a text or Excel format. If you don’t see an Excel-specific format, look for one called Comma Separated Values (CSV), which Excel can open.
While your website may have had many more working links, discovering them and redirecting them to appropriate pages on the new site will offer diminishing returns due to their light use.
Now that you’ve got the list of URLs you want to keep alive, you’ll want to identify the best match for each of them on the new site. The matches don’t have to be exact. For instance, we redirected all our old individual calendar events and news items to our new “News and Events” page. That said, avoid redirecting the entire list to the homepage of your new site if you can. This isn’t helpful for users, and Google even explicitly recommends against it in the great video on this page about 301 and 302 redirects. That page also includes some links on how to actually code these changes on your web server.
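For the curious, on an Apache server each redirect is a one-liner in an .htaccess file. The paths below are made up for illustration:

    # Send one old page to its best match on the new site
    Redirect 301 /calendar/spring-gala.html http://www.example.com/news-and-events/
    # Or sweep a whole retired section to a single new page
    RedirectMatch 301 ^/calendar/ http://www.example.com/news-and-events/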
If you have questions about any of the specifics just post in the comments!
I’ve spent a lot of time at work lately struggling with how to send a form submission to two different places at once.
I started out just using the jQuery get method. Why not, right? All I wanted to do was make a one-way request to a remote server. It worked as expected, and I moved on to another project.
Unfortunately, I’m spoiled by jQuery’s usual consistency across browsers, and I didn’t test that the data was going through in any other browsers. It turns out that while Chrome’s behavior is according to spec, it’s still not something the other browsers permit.
So we switched to JSONP. The Wikipedia page for JSONP explains how it’s done on the back end best. Basically, it dynamically adds a script tag with its src set to the remote server, carrying the data you want to send in the query string.
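In other words, something like this happens under the hood. The URL and callback name here are made up:

    // jQuery builds and injects a script tag roughly like this:
    var script = document.createElement('script');
    script.src = 'http://remote.example.com/submit?name=Bob&callback=handleReply';
    document.getElementsByTagName('head')[0].appendChild(script);

    // The remote server replies with JavaScript that calls your function:
    // handleReply({"status": "ok"});
    function handleReply(data) {
      alert(data.status);
    }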
Then we ran into another weird problem: it would work in Firefox while debugging, but not in practice. That meant Firefox was executing the form’s post before letting jQuery finish the getJSON call. Since we needed to pop up an alert for the user anyway, I just added that after getJSON and it worked flawlessly. But in other scenarios there’s still the open question: what’s the best way to give the request enough time to finish?
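One option I’ve been mulling over is holding the form’s post until the JSONP callback fires, something like this sketch (the selector and URL are hypothetical):

    $('#contact-form').submit(function (e) {
      var form = this;
      e.preventDefault(); // hold the normal post until the remote call finishes
      $.getJSON('http://remote.example.com/submit?callback=?', $(form).serialize(), function () {
        // Calling the DOM's own submit skips this jQuery handler, so no loop
        form.submit();
      });
      // Caveat: if the remote server never responds, the form never posts,
      // so a fallback timeout would be wise in practice.
    });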
Using JSONP to send and receive any kind of sensitive data is dangerous as any malicious site could make the same request.
I wonder if there isn’t some simpler way to do this, but the alternatives sound just as messy, e.g. dynamically opening an iframe with display set to none.
Comments weren’t working on this blog for some reason.
When trying to post, you’d be redirected to the post’s URL followed by /comment-page/#comment-, when it should look more like /comment-page-1/#comment-74. I’d also get an email about the new comment, but all the normal fields would be blank: Author, Email, URL, WHOIS, and Comment.
Googling the problem, I found that this issue is usually caused by a bad permalink structure. So I made sure the .htaccess file was writable and contained only the automatically generated WordPress lines. I even temporarily removed my root folder’s .htaccess for a minute, but that didn’t fix the comments (though it did make all the folders display indexes, 1990s style).
Since I do use a custom permalink structure (/%year%/%postname%/), I turned it off and still had the same problem. With it off I got this URL:

And that URL did not 404! But there was still no new comment. Looking at it, I realized that the comment should have had an ID on the end, but didn’t. So it wasn’t a permalinks problem; it was a “comments aren’t getting saved to the database” problem.
I updated all the WordPress files and switched to the new theme, just in case. When that didn’t solve anything, I logged into cPanel and ran a check and repair on the database. The check threw some nasty errors, and then the repair fixed them all in the span of about 60 seconds.
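I don’t know exactly what cPanel ran under the hood, but the by-hand equivalent in MySQL would be something like this, assuming the default wp_ table prefix:

    -- Look for corruption in the comments table, then repair it
    CHECK TABLE wp_comments;
    REPAIR TABLE wp_comments;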
Today I Learned: 1. Databases can trip and hurt themselves. 2. Timeout errors while uploading via FTP may be due to being on a virtual machine.
Still ranked 6. Google has picked up my new blog title and my previous post. The current list looks like this: RPs (other Robert Pates) on the White Pages, an RP on LinkedIn, an RP on Facebook, the RP Wikipedia entry, the RP Cal State page (cstv), and then me. If you’re testing your own stuff, don’t forget to clear your cookies or just open a private browsing session; Google will customize your results otherwise.
Next question: Should I 301 or 302 from robertpate.net to robertpate.net/blog? hanselman.com uses a 301, but I haven’t run across a lot of other blogs doing the same. Currently I’m using a 302.
I think I’ll test that next. While I worry that a 301 redirect may make my root domain less potent if I ever add content there, I’ll do it anyway, for science!
Unfortunately, the official list is worthless to someone who doesn’t already understand the differences between the basic types. In googling the issue, I found a few helpful resources: a “quick ref license chooser,” which is a great idea but didn’t help this noob a whole lot, and a video from Red Hat entitled “Open source software licenses explained.” The video was the biggest help and is worth the six minutes it takes to watch.
What I really wanted was something as simple as Creative Commons, but I couldn’t find one, so I drew up this comparison. The licenses are obviously not the same, nor are they compatible in many cases. This is only a loose comparison, but I’m hoping it will still increase understanding for those coming from the FSF to CC or vice versa.
Each of these CC licenses also has a NonCommercial variant that prevents commercial use, but I couldn’t find a parallel to it among free software licenses. Why that is could probably be a whole separate blog post.
For further reading, check out this David Wheeler post on why you should use GPL for your software, the BSD licenses Wikipedia entry, the GNU instructions on how to include GPL in your project, and CC faq entry for why you can’t use a creative commons license on software.
I’ve moved my blog around a lot over the years. I did this again recently because I realized I was sitting in 9th place in search results for my own name. As a web admin, I should rank higher simply because I know how to set up a site for good SEO.
So I decided to fix my site up and at the same time test a lot of the best practices.
Since I’m targeting Robert Pate as my keywords, I changed the title of my blog from robertpateii to Robert Pate II. I’m not sure how Google treats spaces, but I figured an exact match would be better than a pseudo match. I moved back to robertpate.net instead of robertpateii.com for the same reason.
Also .net made more sense to me, subjectively. I’m not a commercial enterprise. I’m not a networking company either, but at least it’s the right industry for me.
I also made sure I was redirecting everything to www so that there isn’t any duplicate content, redirected all the old URLs from when the blog was on robertpateii.com, and finally updated WordPress manually.
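The www redirect is a small mod_rewrite block in .htaccess. This is a sketch of mine and assumes Apache:

    RewriteEngine On
    # Catch requests for the bare domain...
    RewriteCond %{HTTP_HOST} ^robertpate\.net$ [NC]
    # ...and permanently redirect them to the www version
    RewriteRule ^(.*)$ http://www.robertpate.net/$1 [R=301,L]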
Oh, and I posted this so that there’s some fresh content. In a week or two I’ll update this with my new Google rank.
June 11th, 2011 Update:
6 days later and I’m now ranked 6 for Robert Pate, but Google hasn’t picked up my new blog title yet. The ones in front of me are WhitePages, LinkedIn, Wikipedia, cstv.com, and justia.com. That’s not bad for having little content and no one linking to this site. There are two obvious next steps: go through all my online spaces and link to my own domain, and add useful content.
July 8th, 2011 Update:
See the SEO Update Post.
Do you use WordPress too? Are you more interested in awesome than in hope?
This is not just a plugin, it symbolizes the awesomeness and enthusiasm of an entire generation summed up in six words sung most famously by Dick Valentine: You Must Obey the Dance Commander. When activated you will randomly see a lyric from “Dance Commander” in the upper middle of your admin screen on every page. It can be active at the same time as the original Hello Dolly plugin.
Update on January 14th, 2011: I’ve uploaded a new copy of this plugin that changes the styling back to the original settings. This means you cannot have both on at the same time, but I think it looks better. Here: Dance Commander Plugin – Original Styling
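If you’re curious how it works, it follows the same pattern as Hello Dolly. Here’s a trimmed-down sketch; the function names are mine, and I’ve cut the lyric list down to one line:

    <?php
    // Pick a random line from "Dance Commander"
    function dance_commander_get_lyric() {
        // The real plugin carries the full lyric list; one line here for the sketch
        $lyrics = array( 'You must obey the dance commander' );
        return wptexturize( $lyrics[ mt_rand( 0, count( $lyrics ) - 1 ) ] );
    }

    // Print the lyric at the top of every admin screen
    function dance_commander_say_it() {
        echo '<p id="dance-commander">' . dance_commander_get_lyric() . '</p>';
    }
    add_action( 'admin_notices', 'dance_commander_say_it' );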
You probably didn’t see, or don’t even remember, the little tiff in August that Time Warner had with ESPN/ABC and Disney.
Or the one in December 2009 with Fox.
That’s nice that they kissed and made up, but it’s probably for the last time. The whole model should be, will be, shifting as the internet gets faster and the cable networks wise up.
TWC is just a middle man when it comes to television content. And in this age of internets, middle men are going out the window. Consumers and producers both benefit from direct exchanges, but these direct exchanges are traditionally inconvenient to arrange for both sides. Thus the need for dedicated middle men. The internet opens up distribution by making these exchanges easy to find and execute.
TWC should get ahead of the curve and focus on making its internet service faster and cheaper. Let the companies that actually produce the content sell it directly over the internet to the consumer. They’ll have to do this to get ahead of the Google, Verizon, and AT&T fiber networks, and even the growing 4G services (i.e. Clear, Sprint, and now, sort of, T-Mobile). Otherwise, in 10 years Time Warner will find itself with a shrinking share of the ISP market and a dying cable television model.
Unless . . . the internet doesn’t get faster. If you dig into the ESPN/Disney agreement, they say “Subscribers will also have unprecedented digital access to online content and expanded Video On Demand services.” But that digital access is now being “authenticated.” It appears free, but you’re really paying for it with your cable fees. And if net neutrality gets destroyed, these authenticated services will run fast while all the other competing online options run slow. That’s why the cable companies, the middle men, are so hot to bring down net neutrality.
I’m hoping Google TV works with all of these services – both free and authenticated. I’m sure Apple won’t play nice since they’re working on a competing product, but that’s Apple for you. As long as we get a market place for video that’s open and competitive with multiple providers, the consumers win.
1. Access to a new authenticated service, which will give Time Warner Cable and Bright House Networks subscribers the opportunity to watch the linear networks ESPN, ESPN2 and ESPNU through their broadband services as well as mobile Internet devices, like an iPad. Details on the launch will be forthcoming.
I’ve been working on a user forum for my company. The solution we’re using has built in translations of the interface, but the translation of the user-generated content is necessarily a completely separate project.
I haven’t seen many other sites translate it either. Judging from the forums I’ve used, that’s because this tech is beyond the current scope of most forum platforms. But happily, there are a few neat things being done these days, such as TED Talks allowing open translation of their talks and Meedan enabling multilingual dialogue.
The Meedan article is especially interesting. They use automatic machine translation on every comment, and allow open editing by translators. It’s my hope that this kind of crowd sourcing and good machine translation can out-pace the compartmentalization of the internet caused by language barriers.
Such implementations are neither free nor easy, even if you’re leveraging the crowd. But English is not going to remain the common language of the internet forever. Does the possibility of three or four different internets worry you? Have you seen other websites out there handling this well? I bet that someone, somewhere, is hard at work on an open source project to solve this problem.
Enjoyed the movie but loved the book. It carries a lot more depth, asks a lot more moral questions of the reader, and develops the plot in a completely different manner and direction.
While I don’t think Deckard’s version of the earth will ever come to pass, it’s still a relevant book for all the questions it asks the reader about what defines a person/soul.
It’s also funny to see science fiction age, i.e. Deckard reading smudged carbon copies in a hover car, and using a pay video phone because no one has mobile phones.
Here’s a quick idea. First, some background: QR codes are similar to barcodes, but they’re square, hold more information, and are easily scanned with a digital camera lens like the one in your phone. No cumbersome laser scanner needed.
Just download a QR scanner application (they’re commonplace on Japanese phones and catching on here in the States) and your phone is capable of reading a QR code wherever you find one. The idea: three stickers on a rental movie box. One for good. One for bad. One for OK.
Internet-wise, crossposting first meant posting the same message across many different Usenet groups. For us and all the others in the online job searching industry, crossposting means posting one job on another job board. This can happen time and time again, so that by the time a candidate sees a job, he’s got to jump through a bunch of different boards (sometimes registering for each) to get back to the original site that will actually let him apply.
We’re one of those original sites. Recruiters are personally logged into our network, looking at our candidates’ profiles. But we’re also new to the scene, so to get our jobs out there we have to do a lot of crossposting. And it’s not free, of course; crossposting is how a large subset of the industry makes its money. There’s even a handy-dandy site out there that will manage a lot of your crossposting for you as a one-stop shop.
What’s neat about most sites (including the one-stop shop) is that they all take a lot of basic HTML tags like lists, bold, line breaks, and paragraphs. We, however, don’t support HTML formatting within our system (yet, anyway; there are some security issues, but I think we can eventually work around them). So in order to crosspost our 60 or so jobs a week, we have to go in first and mark them up with HTML.
Now, I fancy myself an HTML/CSS hobbyist, but the little task that drives me craziest is formatting HTML en masse for all the jobs we’re crossposting. I’ll do a bold tag here, an unordered list tag there, a bunch of list tags to replace bullets, and then I’ll do it another seventy-nine times.
Fortunately, though, there’s a free tool perfect for this job called HTML Kit. While it has a bunch of features that are way beyond my skill level, it also includes the ability to bind a combination of keys to insert text, specifically tags. So I can simply highlight a list, hit F7, and it’s wrapped with unordered list tags. Very handy.
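So a job description’s bullets go from this (a made-up example):

    • Five years of Java experience
    • Strong SQL skills

to this, one F7 press later:

    <ul>
    <li>Five years of Java experience</li>
    <li>Strong SQL skills</li>
    </ul>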
Well, it’s 4:37pm on Thanksgiving eve. I’m going to wrap up and get out of here as soon as I figure out who’s going to check up on the India team while I’m gone.
My career at itzbig started on 8-15-07.
Actually, it started about three or four weeks prior. I’ll just say that in the job search, no means no, but silence means maybe! This is especially true in small to medium size companies.
My initial impression was one of awe. I was really going to work for a true blue startup company. Throughout college I’d always lived in a quiet horror of going into the machine of a big multinational corporation. I knew it could happen because I wouldn’t say no to one. I like to give everyone a chance, and I knew my dislike of them was a little irrational. (Just a little. I’m sure at some point I’ll start a rant on how giant corporations break the free economy, and thus freedom.)
But I had lucked out! By the grace of God, and the McCombs Business School Alumni Job Board, I’d found my way into my own little start up company. Just in time, too. Things started gearing up even faster after I arrived.
Not quite right after I arrived, though. There was definitely a calm before the storm. I was staying busy learning all the tasks that my predecessor had been taking care of, and adding about fifteen or twenty jobs a week into the system.
Then after about a month, on a Thursday, my boss tells me he’s got about 300 jobs queued up in his inbox. I guess the sales guys started trying or something, but my jaw dropped. I’m pretty sure he cackled.
Maybe he didn’t, but regardless, by Tuesday we had three temps from one of the local temp services that use our network.
It was kind of rocky there at first. They sent us some people interested exclusively in data entry, when the job actually requires about 80% data analysis and the rest data entry. So after a few days of getting caught up on the entry, we basically had two people who could only do the five-minute data entry job waiting on two people doing the thirty-minute data analysis job. And the second of those two was me, who often gets called away from analysis for tech support or other details.
But one of the data entry only people was called away to her true passion, flower arranging. And they replaced her with another data entry only person. I kept her busy with data entry for most of the week, but the last few days I had her try the analysis. She kept getting real frustrated with it and didn’t seem to enjoy it at all. She didn’t come back on Monday, though I think that might have been of her own will.
Next we got Theresa, who immediately took to the data analysis along with Martina, who’d been doing it since the third day. With them focusing on data analysis and Anna dedicated to data entry, we finally got production running steadily. I still jump in to pick up any slack, but most of my production time now is spent breezing over the job files Gino sends me, watching for jobs we don’t support and for nasty surprises.
Tech support also keeps me busy, since I’m the single tier 1 support operator. It comes somewhat naturally, though, since I’ve always made myself available as a helper for the online games I’ve played in the past.
Gino: Gino’s been a great recent addition to our team. He took over from my boss the task of managing the incoming jobs from success managers and passing them off to me. He’s the contact point for the whole sales team and the success managers, so I can focus on working with the temps to get everything into the system. That frees up a big chunk of my boss’s time, which lately he seems to be spending on developing automated, accurate reporting tools between our SQL server and Excel.
And most recently (today) we hired on an intern from UT. Today’s her first day, so I gave her my “Production Walkthrough” and let her rip. We’ll see how it stands up to real use by someone new.