Sunday, January 18. 2015
Ignominious End to the T-Mobile experiment
So I'd discussed my first month as a T-Mobile customer here.
The experiment is coming to an end, and while previously I'd been merely annoyed, I am now quite angry. For all their "uncarrier" blather, this is a seriously broken company. First, the pleasant part: all the customer-facing folks I encountered were polite and behaved like they were interested in quality of service. But that attitude does not extend throughout the organization; some elements of T-Mobile are sheltered from customer service considerations and show little interest in it. What's more, they seem unable to consider the customer's point of view.

The specific issue that caused me to decide to leave is that their coverage map, which shows that I should be roaming onto AT&T in the area immediately around my house, is not telling the truth. Even if I turn off automatic carrier selection on my phone and try to join AT&T's network manually, it doesn't work. I know the phone is OK because on a long trip to NC and back, I frequently saw it switch over to AT&T; the phone is fine, this is a local problem. I have opened three tickets on this subject, and all of them have been closed by Engineering for various reasons. Engineering is clearly not concerned about this coverage issue; closing tickets seems to be the metric by which they are judged, rather than actually fixing problems. That does not make for a good customer experience. Moreover, I have complained to them point blank that their coverage map is wrong, that the coverage seemingly isn't going to be fixed, and that this effectively constitutes false advertising - and I see little evidence that anything will be done about that, either. And it's the bogus coverage map that caused me to take this gamble in the first place.

T-Mobile has a wifi calling option, which I've been trying to use. It is erratic, likely because my TWC cable broadband is erratic. I also find that I frequently need to power cycle the phone to make it work, even though the phone always acquires my home wifi quickly.
So earlier today, a T-Mobile rep on their Twitter help line offered a "CellSpot" wifi router which they're sure will help with the reliability of the local wifi calling. I don't really see why it would - TWC broadband has always been flaky where I live - and while T-Mobile is willing to loan it to me rather than charge the outright $200, they do want a $25 deposit for the bloody thing. And it still won't fix the broader coverage issue. So they have a non-solution to the coverage problem that will only cost me an additional $25. (From all my years in IT, I tend to have a very bad reaction to anyone who responds to their own poor performance by trying to separate me from more money.) At the AT&T store, when I called about the phone lock (more on that in a second), they suddenly offered me a cell phone booster thingy.

Now about the phone lock: back in December, I bought a full-priced phone and went on no-contract pricing. It turns out that they lock even the full-priced phones, and you have to be a customer for either 60 or 40 days before they'll unlock it (60 according to the phone rep, 40 according to the web site; no idea which is actually true). I haven't been a customer of theirs long enough, but I'm having a lot of trouble convincing myself to suffer this service any longer. Financial prudence, though, is likely to require that I go back and suffer. I do wonder whether they would continue the unlock clock, or reset it back to 40 (or 60) days in order to stretch out the suffering. But for the moment, I have a useless iPhone 6, locked to T-Mobile, with an AT&T SIM in it. Cell phone carriers wonder why we hate them. Or maybe they know.

Thursday, January 1. 2015
fun with T-Mobile
So back in December, I switched from Verizon to T-Mobile. I had heard all the stories about how bad T-Mobile's network was, but I have come to despise the subsidized phone pricing model that dominates in the US and I really wanted to give T-Mobile a chance. I purchased a full-price iPhone 6 from T-Mobile about 10 days before my Verizon contract expired and activated it using a temporary number in order to get an idea of what was coming. I've been rather disappointed to date, but there is still a possibility that T-Mobile might redeem themselves.
I live in a fringy area for T-Mobile's towers. I see weak T-Mobile signals in my driveway (south side of the house) about half the time; in my office, which is on the other end of the house, T-Mobile signals are rare and not usable when they do show up. However, AT&T is supposed to carry the load there; they partner with T-Mobile around here and have a usable network in Averill Park, which I know from my years as an AT&T customer. When I went to activate my phone, well, nothing doing. The first T-Mobile rep told me the problem must be the brand new SIM that was preinstalled in the phone, and directed me to take the one-hour round trip to the nearest T-Mobile retail outlet to swap SIMs. I combined it with another trip to reduce the pain level. The T-Mobile retail store was a trip; the salesmen were all quite sure that T-Mobile service never worked in Averill Park, and didn't want to replace the SIM. I pointed out to them that T-Mobile's published coverage map included Averill Park, and insisted on the SIM replacement (even though I was sure it wasn't going to work) because of the importance of jumping through the hoops properly when dealing with tech support. They seemed pretty oblivious both to the marketing materials claiming service worked here and to the proper way of interacting with tech support. I got the SIM replaced and left. As I expected, it fixed nothing. The second call to tech support came about the time I actually ported my Verizon number over to T-Mobile (on the 15th of December). We worked through a bunch of stuff; the tech support guy got my phone working over the local wifi (wifi calling being a new feature to me). This means my phone usually works within range of my wifi at home, which saves things (I work from home most days), but it's not entirely reliable.
He also walked me through manual carrier selection, and we thus figured out that my phone can see the AT&T towers around here but won't roam onto them, despite the partnering agreement. This could be a provisioning screwup on either T-Mobile's or AT&T's part. He opened an engineering ticket and told me to wait at least 72 hours. I don't know the details of what he put in the ticket; hopefully it was about the failure to roam onto their partner's network. Then the holidays came up, and I decided not to pursue the situation until we returned from visiting family in NC. On the long drive there and back, I observed the phone successfully switching to AT&T in a number of gaps in T-Mobile coverage, which showed that the problem wasn't system-wide and that my phone was not defective; all good information to have. So two days ago, I poked at T-Mobile's Twitter help account with a stick, asking them what was up with the ticket. I also drove to the post office - not just to pick up my mail, but also to do the manual carrier selection thing and verify that I still couldn't roam onto AT&T's network. I can't, so the problem isn't fixed. (The idea behind driving to the post office is that there are no T-Mobile towers reachable from there, so there is no chance of a poor T-Mobile signal interfering with roaming.) The word from the T-Mobile Twitter people came back: the ticket was closed, apparently because Engineering saw my phone show up on T-Mobile's network - which of course wasn't the actual problem. So there's a new ticket open; we'll see what happens this time. But the closure of the ticket has me wondering whether the performance metrics for their engineering group have to do with closing tickets and the length of time tickets stay open, instead of actually solving problems.
Because closing a ticket should not be confused with solving a problem; when you make ticket closure alone the criterion, you create a perverse incentive to close tickets regardless of whether solutions are found. Anyone who has any experience running call centers and engineering support is supposed to know this.

Sunday, October 12. 2014
Reshaping the Network Course - Resumed
I had stopped working on improving the course, and posting on this subject, as it was unclear if I would be teaching it again in the spring. Now this appears to be cleared up, so I'll resume posting about what I'm doing.
The highest priority is improving the projects. The second and third project descriptions are very sketchy by comparison with the first; I need to bring them up to the same standard. In addition, I want to add a fourth project which involves adding SSL to one of the first three. Finally, I want to have the students install Wireshark and learn how to use it to monitor the packet exchanges in their projects. The next priority is the textbook change. At this point, I think we'll go with Kurose & Ross, Computer Networking: A Top-Down Approach, as starting from the application layer may help in many ways; every other textbook I've looked at takes the old-fashioned bottom-up approach. This will require me to reorder my lectures, as they currently track the bottom-up approach, but I don't think that will be a huge problem. I am temporarily giving up on adding the network simulator; there is simply not enough time to set that up properly.

Monday, June 9. 2014
on the Ineptitude of Phishers
hmmm, just got an email about a USPS delivery, need to print a label and take it to the post office.
but hey, the mail's From: domain is buran7.beget.ru, and the payload or link i'm supposed to click on is missing from the message. whew, guess i dodged a bullet there.

Computer Networking - purpose of the course
Deferring once more the discussion of the implications of online content...
There are two different ways to play a Networking course. Is the room full of potential network engineers, or is it full of potential application developers? It makes a difference. In the case of UAlbany, the course is offered in the framework of the Computer Science department, and the students tend to be mostly software oriented. This means that I should be trying to make them comfortable with socket programming paradigms, which are a bit different from straight-line, single-threaded development exercises. Socket programming is also what many (most?) of the full-time faculty think the course is supposed to be about. But this doesn't excuse me from teaching a lot of the lower-level stuff. I've encountered a few too many software developers who, while competent on the development side, are more than a little vague on how some of the networking stuff actually operates. You can write a socket program without understanding how the client side determines a port number, but you may find yourself at a loss when trying to use tools like nmap, tcpdump and Wireshark to poke at your application with a stick, and you may be pretty clueless about firewall setup as well. So at a bare minimum we need to talk about IP and friends, and it makes little sense to leave out layers 1 and 2 (of the ISO model) when talking about everything above them. So the future evolution of the course will need to improve on the socket programming side while still providing the students with a solid foundation in how the various layers of the reference model work. And there will always be a need to cover things like clocks and NTP, because I can't see them encountering that stuff anywhere else before they graduate and move out into the real world.

Computer Networking - 2014 Syllabus
I'm going to defer the promised posting about the implications of the online content of some of the Pearson books for now, in part because I'm going to be getting evaluation copies of some Morgan Kaufmann books that have online content as well; best to go through both sets of terms and conditions and think about what they mean first. In the meantime, here's the syllabus from this past spring - one task will be to compare the syllabus against the content of each book to assess coverage.
Saturday, June 7. 2014
Reshaping the Computer Networking Course - the Textbooks
So I have a number of textbooks in front of me, with at least one more coming. Four of the six are straight up networking textbooks, the fifth is a classic on Unix socket programming that would be a supplemental text, and the one coming is an O'Reilly book on Java Network Programming that would also serve as a supplement.
Where do these evaluation copies come from? I actually paid cash for my Kindle edition of TCP/IP Illustrated, Volume 1 a year and a half ago, as I jumped into the course with very little notice. O'Reilly has an ebook-based evaluation program, and I expect to receive a link for a copy of the ebook version of _Java Network Programming_ "any day now". The remaining books all came from Pearson, who offer a choice: either a traditional paper copy, or web access. No ebook option is offered. I would have preferred ebooks, and selected the paper option, as otherwise I would only be able to access the books when I had an internet connection. The first thing to note is that of the six books, five are from Pearson, who have acquired Prentice-Hall and Addison-Wesley and thus moved into a dominating position in publishing for the college and university marketplace. So while there is certainly textbook choice, there isn't much publisher choice. Another element to consider is that ebook editions are available for all of these books except the Unix socket programming text, although the price break for the Kindle editions is around 10%, so that $124 networking text doesn't get a whole lot cheaper in Kindle form. Of the four networking texts from Pearson, two have "premium content" online. For both books, there is a scratch-off strip inside the front cover with a code that is good for six months of access. Presumably Pearson wants instructors to use the online material in support of their courses. I considered this, and realized that it represents a fairly naked attempt to kill off the used textbook marketplace - the codes are one-time use, and once used, the value of the used textbook drops rather sharply. I'm not playing that game. While I might adopt one of these texts, the course will not require that the students access the premium material.
In the case of one of the "premium content" books (Stallings, _Data and Computer Communications_, 10th edition), the two chapters on computer security are behind the online premium wall. I find this particularly egregious, and so the Stallings book is DQ'd from consideration from the very start. Now I'm down to three straight-up networking textbooks. The other book with premium content (Kurose & Ross, _Computer Networking: A Top-Down Approach_) doesn't hide critical material in the same manner, so it remains in the mix for the time being. Its premium content is related to class exercises, and I can manage that on my own; it will be the students' choice whether they scratch off that strip. The other two in the mix are Fall & Stevens (_TCP/IP Illustrated, Vol 1, 2nd Edition_, the text I've been using) and the 5th edition of Tanenbaum & Wetherall's classic _Computer Networks_. So what are some of the differences and things of note? Stevens and Tanenbaum both start from the bottom with layer 1 (the physical layer) and work their way up through the network stack. If I use one of these, I have to tinker with supplemental material so that I can get socket programming assignments going early in the class. Neither of these books really pretends to deal with network programming anyway, but it's good for students to have at least some understanding, however cursory, of UDP and TCP before you tell them to start coding. Kurose & Ross is structured very differently (it's that "Top-Down" part of the title). In looking it over, I can't help but think that this approach is a really good idea for several reasons. It puts me in a better place to give out assignments early on, and I think it's probably easier to motivate students by starting with applications, which are something they've already seen, even if only as black boxes (e.g., web servers and mail servers). Additionally, they do cover network programming - but they use Python (previous editions used Java).
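To give a flavor of what that Python-based approach looks like, here's a minimal UDP exchange of my own - a sketch in the spirit of the Kurose & Ross examples, not code from the book; the address and message are arbitrary:

```python
import socket

# "Server": bind a UDP socket; port 0 lets the OS pick a free port.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(('127.0.0.1', 0))
port = server.getsockname()[1]

# "Client": UDP needs no connection setup; just fire off a datagram.
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.sendto(b'hello, world', ('127.0.0.1', port))

# The server reads the datagram and echoes it back, uppercased.
data, addr = server.recvfrom(2048)
server.sendto(data.upper(), addr)

reply, _ = client.recvfrom(2048)
print(reply.decode())

client.close()
server.close()
```

The whole exchange is about a dozen lines, which is exactly why a top-down text can put working code in front of students in week one.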
So if I go with this text, I'd want to have the supplemental textbooks available, as I'm not inclined to dump a new language on students who are already dealing with lots of other things. The fifth book is Stevens' _Unix Network Programming, Volume 1_, which would be a supplemental text for the C developers, and the sixth is the aforementioned O'Reilly book on Java network programming. I will probably look for one or two other supplemental texts, but this is really the least of my concerns. So basically, I am leaning towards Kurose & Ross despite the premium content issues, because I like the order of presentation better. Now to start looking at the books really hard to see what's what. The next blog posting will discuss some of the challenges I see with the online content.

Thursday, June 5. 2014
Networking - Course Construction vs. Textbook Construction
I've taught the spring course in Computer Networking at UAlbany for two years now. The first year, it was a bit of a surprise and I was playing catchup from day one. The second time, it was a little less of a surprise, but I still spent a lot of time playing catchup. One result of this was that for a textbook, I defaulted to using the 2nd edition of Stevens' classic TCP/IP Illustrated, Volume 1, because I was familiar with it and liked it. But I've become concerned that there is a bit of divergence between the course syllabus and the book, so I decided to check out some of the alternatives. I now have copies of four other textbooks to review, and have found a large can with many worms inside. Since it's tolerably certain that I'll be teaching the course again, I was going to work steadily on improvements over the next eight months in any case, but even a quick once-over of the textbooks suggested other things I should look at.
But first, I should describe the basic course parameters. It's offered by the Computer Science Department, so the presumption necessarily is that the average student will have experience writing single-threaded programs that run on a single computer, and limited knowledge of the hardware side of things. My observation of the student mix is that, given the choice between writing projects in C and writing projects in Java, the class splits fairly evenly, so I expect to continue to accept projects in either language. I also note that student choice of OS on their laptops varies, which is OK with me, but does mean that if I want them to install any particular software, I need to keep that in mind. I can in theory go with Un*x/Linux-only software, given that all students have accounts on a University Solaris system; I can make them ssh into it and work with a text-based system, which the Windows types may find annoying, but I assert that it's good for them. As the syllabus has evolved, I've found myself adding lots of stuff that's not in Stevens (and Stevens isn't really a programming text anyway, so the socket programming API has always been supplemental material), and I've been using my own ordering of presentation, which has gradually been improving. I have leaned on some lectures written by a prior instructor for subjects like queuing theory. The projects to date have been socket programming exercises. A prior instructor used one of the ns series of network simulators for exercises, but I haven't had a chance to really look at how I might integrate such a thing. From the perspective of ordering of material, I need to get them doing some socket programming early, so at least a cursory outline of TCP functionality needs to come very soon after the course starts, to provide a framework for understanding why the Sockets API is what it is.
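As an illustration of why even a cursory TCP outline has to come first: in the minimal TCP echo below (my own sketch, not one of the course projects), accept() and connect() only make sense once you know they are completing a connection handshake before any data moves.

```python
import socket
import threading

# A one-connection TCP echo server on a loopback address.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(('127.0.0.1', 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

def serve_one():
    conn, _ = server.accept()          # completes the TCP handshake
    with conn:
        conn.sendall(conn.recv(1024))  # echo whatever the client sent

t = threading.Thread(target=serve_one)
t.start()

# The client side: connect (handshake again), send, read the echo.
client = socket.create_connection(('127.0.0.1', port))
client.sendall(b'ping')
echoed = client.recv(1024)

client.close()
t.join()
server.close()
```

Without at least the notion of connection establishment, listen/accept/connect look like arbitrary ritual; with it, the API's shape is obvious.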
If I add network simulator exercises, I may (depending on the ordering of course material) be able to delay the first socket programming exercise a little. So these are the sorts of things I have to consider as I go forward. The next posting will be a little bit about the five textbooks I have in hand right now.

Tuesday, December 10. 2013
Grace Murray Hopper
While I appreciate all the tributes to Rear Admiral Hopper on what would have been her 107th birthday, I'm disappointed in the superficiality of most of them. Usually they mention her naval career and COBOL, but that's about it. She was actually quite a bit more than that.
Before WWII, she earned a PhD in mathematics from Yale and became a faculty member at Vassar. During the war, she joined the Naval Reserve and found herself assigned to work at Harvard on the Mark I, where she became one of the first programmers. After the war, she wished to remain in the Navy, but they declined because she was too old (38). She remained at Harvard until 1949, when she joined the team developing the UNIVAC I. It was while she was there that she developed the concepts of higher-level languages and compilers (Hopper built the first working compiler in 1952). She returned to active duty with the Navy in the 60s, and retired and unretired several times until her final retirement in 1986. She was a strong advocate in the Navy for a move to smaller, distributed, networked computers instead of large centralized ones. Her career spanned the academic, commercial and military worlds, and she was very accomplished in all of them. Just referring to her involvement in COBOL isn't giving her nearly enough credit. And I think she really liked being in the Navy.

Tuesday, September 17. 2013
the problem with certificates
updated below - 2013-09-17
What is a Certificate and what is a Certificate Authority?

How does a web site prove it is who it says it is? The short answer is that it goes to a Certificate Authority and purchases, after supplying some proofs, a certificate that says who it is. The long answer is that much of the cryptography infrastructure of the internet depends on the Public Key Infrastructure (PKI), which is based on the X.509 standard for digital certificates. These certificates provide a framework for "proving" that an actor is who they represent themselves to be, and for negotiating the encryption to be used for a secure connection. There are two groupings of certificates: self-signed certificates and certificates derived from Certificate Authorities (CAs). CAs can be public entities (you can get "store bought" certificates from Verisign, Geotrust, or whoever) or private ones; if the latter, they aren't much different from self-signed certificates. Network applications that use certificates typically have a list of CAs whose certificates they will accept, and in some cases you can add CAs or individual certificates if you decide to trust them. When your browser sets up an https: connection to an online store, these certificates are in play. This is how you, at least in theory, know that sears.com is really Sears and not some joker playing at being Sears in the interest of stealing credit card numbers. This makes CAs a target, and a very attractive one at that. If you can compromise a CA, you may be in a position to create certificates that allow you to pretend to be someone else, which permits all sorts of nasty attacks. And CAs have indeed been compromised, and at least one that I'm aware of has had to shut down because of the extent of the compromise. You see, once a CA has been compromised, the trust relationship is broken - you cannot distinguish between the "real" and the "fake" certificates, and the game is over.
This is where it gets terrifying

We know that the government, under cover of various and sundry security provisions, has sent orders to outfits like Lavabit, essentially ordering them to operate fraudulently by continuing to sell a secure service while secretly supplying the "secured" data back to the government. This is a huge ethical dilemma, which the owner of Lavabit dealt with ethically by simply shutting down his business. The ethics of any ensuing government prosecution are left for the reader's consideration. Should we assume that we know about all the orders in play? Of course not; for every ethical business owner there are no doubt other corporate entities who have chosen differently. So at this moment, can we trust any US based Certificate Authorities? I think the answer is a resounding no. And this means that anything that depends on certificates from any US based CA, whether it's a secure website or an IPSec based Virtual Private Network, is no longer trustworthy. And that should terrify us all.

So what can we trust?

We can trust private/public key systems that don't depend on X.509, like PGP/GPG. These work using the concept of web-of-trust, where you agree that you know that someone is who they say they are and accept their key. In theory, self-signed certificates are now more reliable, if you can verify the identity of the signer. This is in essence a variation on web-of-trust.

The problem with endpoints

This still doesn't mean much if the security endpoints are compromised. The NSA can do that, but it's expensive, so they're more likely to do things like attack the certificate infrastructure. But your PGP/GPG key won't mean much once they hack into your PC or Mac and install a keylogger. But I've depressed everyone enough for one day.

Update - 2013-09-17

FYI, I am now using OpenPGP to sign everything sent from my primary email account.
The fingerprint is 3133 3F6D AB20 AC3F 9C88 DC61 0F2C 74F4 7012 C7FA, the short id is 7012C7FA, and the keyserver is hkp://keys.gnupg.net. It's a 4096-bit RSA key. Cheers!

Saturday, February 16. 2013
lessons from my recent macbook outage
when you add an external drive to your mac laptop as a backup device, go with two partitions: one for the time machine backup, and one for a bootable copy of mac os x. put a copy of the mac os x installer in the bootable partition. when you do need to restore, you can boot from the external drive, reinstall onto your new or newly wiped internal drive, and restore from the time machine backup.
if you go with this setup, you also have the option of restoring your backup onto the external drive while you wait for the replacement hard drive to arrive from your vendor of choice. it won't be much of a laptop if you have to lug an external drive around, but at least you'll be on the air. don't attempt to defrag in place; wipe and reinstall from backup. it was defragging in place that put me into the 36-hour rebuild from hell.

Friday, January 6. 2012
new adventures in Lame Customer Service, part II
Apple generally tends to be good at customer service; it's part of the whole user experience thing that has helped bring Apple back from the dead. this fact makes my recent trip to the Genius Bar all the more disappointing.
my daughter was using her mother's macbook to watch a DVD while on vacation, and let the battery run all the way down. it appears that the battery was/is in the process of failing, and the laptop's timers/counters for tracking battery status weren't in sync with the actual state of the battery. the result was a very hard shutdown, and something got corrupted - like maybe the binary of finder, based on the subsequent behavior of the macbook. now at this point, i don't actually know that the battery is the problem. once we got things charged back up, boots were painfully slow, and logins failed partway through - finder appeared to launch, then crash. we were on vacation, so i didn't have the install dvd. two days ago, back at home, i tried to boot from the install disk only to discover it showed the exact same behavior as my own macbook pro had displayed when the pro's battery was dead. note that i haven't put two-and-two together yet; i haven't yet become aware that defunct laptop batteries are the common issue. so i figure to make an appointment at the genius bar at the local apple store, but i go online and discover they apparently want me to try phone support first. that's ok, the laptop is still covered by applecare, so i request support and wait for their call. they call back fairly soon, and it turns out that they're stumped by the problem too. we spend a lot of time on it, then get a genius bar appointment set up for 2pm the next day (yesterday as i type this). i show up slightly early for the appointment, figuring to check in right away and then kill time, except now the customer service starts to decay a little (it was fine up to now.) the guy doing the check-in is nowhere to be found. after 5 or so minutes, he finally shows up. i get checked in, and finally someone is assigned to me at about 2:20. the guy is fairly knowledgeable. the diagnostics show a bad battery, but we agree that it's plugged in, so stuff should work.
we also agree that the not booting from dvd is something that needs to be addressed, and plug in a USB drive with the install images. that works fine, and we run an install-and-archive to replace the broken copy of Leopard with a good one while preserving all the personal files and applications. it's going to take a while, so the apple guy goes off to handle another customer while the install happens. i presume after the install is verified as successful, he'll be back to look at the can't-boot-from-dvd problem. except he never comes back. the install-and-archive finishes, logins work properly. i see that he's handling a customer with a macbook of some sort that needs to be sent to the depot, i figure that can take a while. in the meantime i install a bunch of the updates to Leopard. this takes a while, and he still never comes back, clearly moving on to other customers (he actually headed behind the counter several times, carefully avoiding any eye contact with me.) finally, at 4pm i give up, and get one of the other employees to unplug the power brick from behind the counter so i can leave. so here are the issues:
i have subsequently figured out that macbooks and macbook pros with bad batteries may not boot from DVDs; this turns out to have been the case for both laptops. it's interesting to me that neither apple phone support nor the genius bar folks seem to know this. i'm 2 for 2 on it, but then small sample size does apply.

new adventures in Lame Customer Service, part I
i bought a copy of Drive Genius 3 from Other World Computing when i upgraded the hard drive in my MacBook Pro a little while back. it's been a decent piece of software, but recently it has been reporting a bad block. i was having trouble booting from DVD (necessary to make repairs to the internal hard drive), so i just suffered for a while. just the other day, booting from DVD became possible again (it turns out that if the laptop battery is at the point of needing replacement, DVD boots will fail.)
once i realized that DVD boots worked again, i also realized that i'm not entirely sure where the DVD of Drive Genius is (my office is, perhaps, a bit messy.) so i looked around the Prosoft web site and figured out that there was apparently a downloadable ISO, but i couldn't find the link for it. i emailed off to support, including my serial number, and was rather happy with the quick response: an email with a link. except the link didn't work right. there was an input field for the serial number and a submit button. i entered the number and clicked submit, and got a confusing html jumble back. i tried the serial number without the hyphens; no change. ok, i'm using Camino, which is an obscure niche browser, so let's try Chrome. same results. ok, Drive Genius is a mac product, surely the link works in a current copy of Safari. nope. so i emailed back to support. they sent back a different link to try. except this one was going to charge me $7.50 to send me a physical disk via surface mail. no thanks, guys, i already paid for Drive Genius 3 and you advertise downloadable copies of the ISO. fix your web site (please.)

Tuesday, November 9. 2010
I remember the Grid
I just saw this on the BBC News rss feed: Laptop Designer Honored. It brings back memories.
Back when I was in grad school, the Grid computer had just appeared. An ex-roommate of mine had just gotten a job with the notorious Beltway Bandits BDM, who had a contract to support the development of modern Light Infantry concepts at Fort Lewis, near Tacoma. There was a Un*x component to the project, a real shortage of Un*x knowledge amongst the BDM personnel, and my roommate thought of me (me at the time being RPI's first proper Un*x administrator). So I spent a spring break in sunny Tacoma teaching BDM folks Un*x basics (no, it actually was sunny that week; I understand that this is not normal). One aspect of the project was trying to deploy computing power in the field, at the company level, for the first time. That was where the Grid came in; it was the first computer to come along that even remotely seemed like it could be used that way. It was a cute little piece; I didn't get to spend much time with it, but it hinted at possibilities. These days, with netbooks and Macbook Airs and iPads and iPhones, we don't think twice about mobile computing, but there was a time when it was merely a possibility - and the early 80s weren't really that long ago.

Friday, July 16. 2010
the joy of contracting
upon further consideration, i'm withdrawing this post. it may resurface in revised form at a later date.