Categories
Software Engineering Tech and Culture

Working Remotely

Everyone is WFH (working from home) these days. Even people who were never in favor of remote work for IT workers are now compelled to set up a dedicated place at home so that work can carry on uninterrupted. Just the other day, I got an IM from an ex-colleague who had vehemently opposed remote work, lamenting to me about how the entire world is now forced to work from home.

I started my professional career in the US in the IT consulting business. It wasn’t the ideal start to a career, mostly because when you’re working as a consultant, you don’t have an office; you go where the client goes. It also meant that I had a level of autonomy, especially considering how little experience I had, that wasn’t afforded even to seasoned permanent employees. For one, I could come in and leave the office at any time as long as I delivered on the project plan. This was just like being in grad school! It could also explain why I ultimately built a good framework for separating work life from personal life.

I still remember working remotely for about a month while recuperating from foot surgery. It was a non-issue. My supervisors had no problem with it even back in the day, and I was even offered the chance to apply for a preferential parking permit to avoid walking long distances once I returned.

Then, as I became more experienced, I was able to take client calls from home or arrange personal matters around professional work. It was all quite flexible and amazing. Such was the business of consulting: helping IT departments set up complex systems.

Even when I moved to pure software engineering (building products for end users) as a consultant, I remember being allowed plenty of flexibility. Sadly, none of those companies is still in business. It seems that back in the day, even tech companies had a pretty good handle on work-life balance. Some of my colleagues worked remotely on a full-time basis, as long as they had negotiated it at some point. And a lot did. Surely, having geographically spread campuses with a more project-oriented focus helped. Perhaps it was also a cultural thing; who doesn’t remember going on long team lunches back then?!

And then, sometime around the early 2010s, things changed. Agile and Scrum, in particular, were already on the rise, but with companies newer and budgets leaner, workplaces started being optimized, both for space usage and for project management. Agile was the hot new buzzword. Cubicles gave way to ‘collaborative’ open spaces full of drawing boards, plants, and as little furniture as possible. My first open-floor-plan office job was actually in 2010. Some people even shared a desk if it was too big for one person.

This was also the time when, battered by the Great Recession, companies were trying to bring back a lot of the operations that had been outsourced until the late 2000s. People just have to be in the same room!

The thing is that none of the benefits of being in the same room outweigh the efficiencies gained by instilling a more flexible and professional work ethic. A lot of this new-age ‘the company is your family’ thinking came from Silicon Valley competing for talent by offering on-site catered lunches, laundry services, daycares, and whatnot. Smaller startups could only preach the mantra of collaboration and agility. In all of this, though, the voice of the many who wanted to focus mostly on productivity was silenced. Surely they aren’t team players and fully committed if they’d want to work from a beach, which is definitely what ‘working remotely’ implied!

In denser cities like Amsterdam and San Francisco, there was more incentive to build fancy office spaces: homes were smaller and didn’t provide enough separation between work and personal space. That, and most people are terrible at time management. The biggest complaint about remote work is that people don’t know when their work day begins and when it ends. Either they’re working all the time, or they can barely settle into a working rhythm. Some people also miss the social aspect, although my impression is that those people mostly have customer-facing roles or have circular dependencies on others in order to contribute.

For the past six years, I have been working remotely. Interestingly, what should be natural for tech workers is now seen as a privilege. I love it, and even though I wouldn’t call myself unsocial, I like that I get the chance to choose between being social, being productive, or both! When I started, I was in the midst of changing my lifestyle and getting healthier. I used to go on morning and lunchtime walks. I could take some calls and manage my email over a coffee with the newspaper. I could plan my lunch at an odd hour. All things that are practically impossible in an office environment.

This brings me to a segue about the aforementioned colleague. I had had a ton of discussions with him about his idea of a productive workspace. He wasn’t a full-time contributor and only wanted to come to the office when he desired. His reason for wanting an office was also that he missed camaraderie at home. All very selfish reasons. And this has been my experience talking to a lot of opponents of remote work: they only hate it up until they have a personal need. At that point, privilege sets in. The whole teamwork argument falls apart, both practically and theoretically. Think about it: how is it a team-building exercise if some people on your team would love a few days of flexibility (for personal reasons) but your project planning requires them to be there in person for every working hour? That’s got to be demotivating.

And now that offices have *had* to close down, I feel that the empty promises of more collaborative workspaces are coming to the fore. Apple is releasing more stable updates, companies are still working and building, and people are still working hard. Some so much that they’re now complaining of corona fatigue. Instead of countless meetings in a conference room, people are now tired of video calls. Things are normal. The only people at a disadvantage seem to be those who prioritized proximity to the office over comfort when picking a place to live. Now they’re left with a tiny apartment in a wonderfully central location, but no workspace.

It is clear that working remotely is here to stay, at least for the foreseeable future. Until the virus situation eases enough for executives to regularly work from an office, it is fair to assume that very few people in the non-essential IT worker category will want to subject themselves to viral exposure. That could take a few months, if not years. And none of that will slow down the progress of technology.

So, was I right in embracing and pushing for a remote-friendly autonomous culture at all my previous workplaces? You bet I was.

Categories
Software Engineering

On iPhone app inputs

This question often comes up.

We’re developing an iOS application, and of course, the user has to input some personal information before they can start using it. What kind of validations should we implement on the device? Which ones on the server?

Most people, especially when starting out on a new project, consider input validation to be a fairly trivial problem with fixed states. But as the application matures, they quickly realize that every input has its own set of error and valid states. The validation problem grows exponentially with the number of user inputs on the device, as more or less every combination needs its own validation.

I firmly believe that validation should never be done on the client, unless that’s the only place you need to do it. If there’s even a single server-side validation component in the project, the team is better off delegating validation entirely to the servers. This makes life much easier for the front-end developers and also makes testing easier. If your application depends on a web service (as most iOS apps do), chances are that the web service will, in the near future, completely rewrite its required input specs, and then you will find yourself in a situation where, instead of adding new features to your iOS app, you are spending valuable time getting all the client-side validations back in place. And that lasts only until the next time it happens.
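For illustration, here is a minimal sketch of what delegating validation to the server can look like. Everything in it is hypothetical rather than drawn from any particular API: the endpoint URL, the field names, the shape of the error response, and the `showInlineError` helper are all placeholders. The idea is simply that the client ships the raw input and renders whatever field-level messages the server sends back.

```swift
import Foundation

// Hypothetical sign-up payload; the field names are illustrative only.
struct SignUpRequest: Codable {
    let email: String
    let displayName: String
}

// Hypothetical server response: either success, or a map of
// field name -> human-readable message produced by the server's
// validation rules. The client never re-implements those rules.
struct SignUpResponse: Codable {
    let success: Bool
    let fieldErrors: [String: String]?
}

enum SignUpError: Error {
    case badResponse
}

// Post the raw user input and let the server decide what is valid.
func submitSignUp(_ payload: SignUpRequest, to endpoint: URL) async throws -> SignUpResponse {
    var request = URLRequest(url: endpoint)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(payload)

    let (data, response) = try await URLSession.shared.data(for: request)
    guard response is HTTPURLResponse else {
        throw SignUpError.badResponse
    }
    return try JSONDecoder().decode(SignUpResponse.self, from: data)
}

// Usage sketch: surface the server's messages next to the offending fields.
// let result = try await submitSignUp(
//     SignUpRequest(email: emailField.text ?? "", displayName: nameField.text ?? ""),
//     to: URL(string: "https://example.com/api/signup")!)
// result.fieldErrors?.forEach { field, message in
//     showInlineError(message, for: field)   // hypothetical UI helper
// }
```

With this arrangement, the client’s only job is to send what the user typed and display the server’s messages; when the service rewrites its input specs, the only thing that changes on the device is the payload struct.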

That said, there is also a need to think about whether an input is really required. Let’s not forget that iOS devices are mostly mobile gadgets whose users are generally in a hurry to complete a particular task. Even if it’s their first time using the application, forcing them to enter personal data that is only partially required is detrimental to the entire user experience. It also adds unnecessary validation.

Conversely, if you have an input component in your app, then you necessarily *have* to validate it. If you’re not validating it, it’s not important and hence must be disposed of.

Lessons: Abide by the Apple Human Interface Guidelines, always. Don’t force your users to input more information than is logically necessary for the application to do its job. If you have to perform input validations, do them all in one place, and that place is the server, where you have the necessary processing cycles.

Categories
Software Engineering

Should you use Interface Builder?

Ever since I started out with iPhone development, I have seen a lot of debate online about the pros and cons of using the Interface Builder tool that comes as part of the developer toolkit for Mac and iPhone development. To a lot of people, the tool comes across as basic, limited in scope, and sometimes entirely useless. For the developers at the other extreme, IB is an indispensable tool without which there is no app development.

This Wednesday, I got the chance to debate just this with the CTO of an iPhone startup here in Washington, DC. While I don’t know much about his technical background, he did mention that his previous programming experience was website development using Flash and Dreamweaver.

Categories
Software Engineering

iPhone development

Since 2008, I have been trying to find enough free time to get heavily involved in developing applications and services for the best phone I have ever used. This is a new category that, starting today, will be a reflection of my learnings and experiences.