He said “if you want a playground, go find another job”

Roughly six years ago, while working at my previous employer, I asked one of the technology leaders, who might have had a bad day, why we didn't have a tech playground so that technologists like us could innovate. His answer was “if you want a playground, go find another job”. That statement both hit me hard and confused me for quite some time, but I stayed on with the company for several years, pushed hard for my ideas, and received support from other leaders who helped me showcase what innovation can do. I will not name those leaders, but I know that they know who they are, because they are good. I later thanked that very first person for what he said, because he made me realize that there is no better place to be than one where innovation is important, and that innovative playing alone is not enough. Planning, evaluating, timing, organizing, and teaming up are crucial for successful innovation. A year and a half later, at my current employer Thomson Reuters, I witnessed a whole different type of leadership, one that sponsors and nurtures innovation as well as transformative thinking across all levels. #workingatTR is exciting, and going back to the office on a Monday with fresh new ideas is even more exciting. #innovation #leadership #business #technology

X is not Y and is surely not Z in the Digital Age

Witnessing over 20,000 women-in-tech attendees at #GHC18, the majority of whom, by pure observation, seemed to be in their early twenties and hence Generation Z (those born after 1998), filling every lecture room about IT and AI, occupying the career fair booths, and consumed with technology, the internet, and mobile phones, was a strong indication that Generation Z is the most crucial generation to think about. Whether in the old or the new generation of businesses, Generation Z members are the consumers, producers, and employees of technology today, and pretty soon they will be the employers as well; that is, if they are not there already, because they are entrepreneurs too.

In the last couple of years, we have watched many companies talk about digital transformation: mobile-first, cloud-first, and everything digital. The digital era became popular over the last two decades mostly because of the rise of social media and mobile, all thanks to Generation Y, the millennials born in the early 80s, who use technology for everything. The enterprises now trying to transform into digital businesses are run by older generations: Generation X, born from the early-to-mid 1960s to the early 1980s, or the Boomers born before them. It remains to be seen whether digital transformation by enterprises run by Generation X or Boomers, at a time when millennials are maturing in the field and the younger Generation Z are now entering the workplace, will force more subsequent business transformations or even bigger bangs! But we need not get carried away with this. What we need is the right amount of customer analytics, market studies with explicit delineation between generations (Boomers, X, Y, and Z) in the research demographics, as well as products and services targeting Generation Z consumers.

So, to go back to the title of this article, “X is not Y and is surely not Z”, I would ask any reader from a generation prior to Z, and myself as well: as the generations from A to Y, make sure that you think like Z, because there is nothing after Z!

Women in Tech

Cheers to Ada Lovelace, the first computer programmer, to Grace Hopper, who pioneered computer programming and COBOL, and to all the women in tech who run code, projects, teams, and companies. But cheers are not enough. We need more diversity and inclusion in the computing workforce. According to ncwit.org, only 26% of the computing workforce in 2017 were women, and of those, less than 5% were Asian, 3% were African American, and 1% were Hispanic. Only 17% of the 3.5 million computer-related job openings expected by 2026 will be filled by US computing bachelor's graduates. Those numbers are not great today, and it is worrisome if they stay that way. Last year I participated in a career fair at a middle school near my home in the Dallas metroplex and noticed that over 50% of the participants who attended my session about technology were girls. Many of them said they were curious about how computers work. Perfect answer! Curiosity is key for the techie! #GHC2018 is a great place for women in technology, and for dads like me who encourage their daughters into the field. I will be helping our Thomson Reuters team at Grace Hopper. Always happy to talk about technology and what we do #workingatTR. See you there.

I am I in AI

I am addressing this to newcomers and professionals in the field of machine learning and artificial intelligence. It is part of a talk I plan to give at the University of Texas at Dallas next month, so I would appreciate any feedback.

E.T. the Extra-Terrestrial
CC BY 3.0

The letters “A” and “I” are ubiquitous these days in the words “Artificial Intelligence.” But I want to change that for a minute and turn AI into “A person that is I.” “I” am the human with a biological brain, the intellect with emotions, the one who cares, the altruistic person, the one who errs, laughs, jumps, cries, fights, learns… the human being. If we take the term AI and describe it from the perspective of ourselves, “A person that is I,” then we may remember ourselves again instead of continuously looking outward toward artificial intelligence. If the AI hype is about artificial intelligence, why can't we think more of AI for the things that matter to us, the human beings, the I?

When you, the human being, get so excited about the latest technologies in machine learning, deep learning, reinforcement learning, AutoML, and lots more, think less about the technology and think more about how you can use it to benefit your Is, the humans. What is the human problem that you want to solve with AI? What can you do with your learning to help improve the society around you? Yes, learning a lot of “A” technologies is cool, but at some point you won't be able to keep up with every article on arXiv, the latest code repos on GitHub, the latest conference proceedings, and the latest computer that you need. The breakthroughs in artificial intelligence are happening fast and can be overwhelming. You may reach a point of distress because you won't be able to catch up. Don't push hard to catch up.

Spend more time on the “I” (you and us, the humans) instead of the “A”, the artificial machine. Look at the many problems that surround you every day, and then find the machine learning tool that helps you solve them. Be that “I” in “AI”.

Don’t Panic. Focus and Learn New Skills

Technical professionals are under pressure to learn something new for fear of missing out on the latest technologies. Non-technical professionals fear that technology and automation will take their roles away, so they are under pressure to learn something new too. All for the same reasons: fear of missing out or of losing their value. Fear takes over their minds. They begin to panic. So they open books, take courses online, or maybe even go back to college. The panic is still there because they sense that it is probably too late to learn new things. They start to have feelings of guilt and hopelessness, saying “why didn't I learn this before?” or “why did I pick this topic when there is now something new that I should have learned already?” Panic. Panic. What happens then? Sadness. Hopelessness. Anxiety. Guilt. All these negative emotions take over the mind just when it should be focused on positive development and learning. Yes, people multitask constantly, but it is not easy to be sad and stay focused at the same time. So what to do? Give up? No! Rush to learn every new thing out there? There is no time! What to do?

My advice to the reader who identifies with what I am writing here is simple: 1) keep doing what you are doing, AND 2) think of ONE problem that you want to solve and focus only on the skills necessary to help solve it. Organize your time between (1) the current job that you are good at and (2) the goal of solving the problem. You will either find a new opportunity that you decide to take, or you will come up with new ideas that feed into the job you are currently doing. In both cases, you will be in control, and you will have something different and real. Focusing on the problem you want to solve is imperative but can be hard; many times problems do not get solved quickly, or even at all. If that happens, go back to the same formula: 1) try again with a new solution, and 2) pick a new problem. I admit that this sounds like an infinite loop. It is, but that is what learning is all about! That is how you focus on solving problems and learning new things instead of wasting time panicking. Voila! We solved the panic problem!

How to write a great research paper

Simon Peyton Jones from Microsoft has 7 simple steps:

1 Don’t wait: write.

2 Identify your key idea.

3 Tell one story.

4 Nail your contributions to the mast.

5 Related work: later.

6 Put your readers first.

7 Listen to your readers.

Check out the article and talk at the Microsoft Academic program.

Other interesting talks by the same author include “How to give a great research talk” and “How to write a great research proposal”.

Nando de Freitas, Ulrich Paquet, Stephan Gouws, Martin Arjovsky, and Kyunghyun Cho also shared their ideas at the Deep Learning Indaba 2018 conference. Check their slides.

Thanks to Sebastian Ruder's awesome NLP newsletter for the material mentioned here.

Not just the world is round but the technology world is round as well

If the world is round, then so is the technology world. In the late seventies and eighties, BASIC and Assembly Language were the hot things of the time. After that, people took on new technologies with the Internet and jumped to languages such as HTML, C, C++, Perl, Python, and JavaScript. Systems moved from 8-bit to 64-bit. Storage devices moved from cassette tapes to the cloud. Entering a program used to mean copying lines of code from computer magazines onto the computer screen. Issuing the BASIC RUN command and witnessing the execution of the program felt like magic at the time. For those who didn't live through the 8-bit era, the same feeling can be felt today when running a Docker script and witnessing your code and servers come to life. 8-bit programmers became the old school, while modern-day software engineers and data scientists became the new school. If you ask a Generation Z programmer what Assembly Language or BASIC is, he or she may not know. But somehow the wonders of the seventies and eighties seem to be returning. Great things never die, just like the Goonies never say die.

The eighties are coming back, and I feel that I am that nerdy child again. The Commodore 64 returned as the C64 Mini, and the Sinclair Spectrum is now the ZX Spectrum Next. I personally still play old 8-bit games using emulators. You can also watch or listen to a lot of new retro computing episodes via podcasts or Netflix. Just yesterday I started watching the movie WarGames for the million and first time.

BASIC and Assembly Language are hot again. Assembly Language made it into the top 15 languages according to the TIOBE Index. Not long ago I happened to meet a young person buying an assembly language book at a Half Price Books store. When I asked him why, his answer was “I heard that it is cool”. And today, I just saw that Google published WWWBasic, an implementation of BASIC that runs via JavaScript. You can do the following:

<!DOCTYPE html>
<script src="https://google.github.io/wwwbasic/wwwbasic.js"></script>
<script type="text/basic">
PRINT "Hello World!"
FOR i = 1 to 10
  PRINT "Counting "; i
NEXT i
</script>

You can also import BASIC code as a Node.js module.

The world is round and the technology world is round again.

Tricking the phone camera!

One of my favorite photos #WorkingatTR. It may look weird to you, but let me explain. We once decided to mess with technology. We surrounded ourselves with boards in a full 360 degrees and placed ourselves between the boards. Think of yourself standing in the center with big boards surrounding you whichever way you turn. One of us then stood in the center, put the mobile phone camera in panoramic mode, and began turning around himself while taking a panorama photo. What you see in the picture is a 2D+time scene (Time from Back to the Future?) flattened into a 2D image. While the photo was being taken, some of us would run and take different positions in order to get photographed twice in the same photo. Unfortunately, the camera was faster than our movements, so we couldn't make Einstein happy with the theory of relativity. Plus, the camera was somewhat smart in recognizing what was going on here; kidding, the panoramic shooting simply stops right around 360 degrees. The point of all this is that I look like one of the interns, and that makes me happy, because it is all about energy, fun, collaboration, and love for technology and teamwork! Caleb Fung Sandilya Madiraju Samir Naqvi Shiqi (Katherine) Li Akhilesh Yeleswarapu

A Life in Reuters

It took forever to get here, but I finally got hold of “A Life in Reuters”, the 1951 autobiography of Roderick Jones, managing director of Reuters from 1916 to 1941. Reuters the company later became part of Thomson Reuters, my employer. What I hope to find are references to the technology of news and journalism of that time. History is always important for future thinking!

August 9, 1982

On August 9, 1982, Commodore released the Commodore 64. It took me close to 5 years of wanting it, asking for it, dreaming of it, and then eventually saving for it (you can read my story at https://lnkd.in/de-XWks). It fueled my passion along with every other 8-bit machine. At the end of the day, the C64 and its predecessor that I cherished, the VIC-20, wired my technology mind and helped me build my career, which I am thankful for. If only I still had my machines! Cheers, Commodore 64!