The Download: A Recap of Future Technologies Conference (FTC) 2016 with Arjuna Chala

Arjuna Chala, Senior Director of Technology for HPCC Systems, and I sat down to discuss the Future Technologies Conference that took place in San Francisco on December 6-7, 2016. Arjuna was a keynote speaker at the event, and we discuss the conference highlights and what he thinks these insights mean for the future of big data technology.

About FTC 2016: 

Three hundred of the world’s top minds in technology and computing convened in San Francisco for the first Future Technologies Conference (FTC) 2016. The well-attended, two-day event, held on December 6-7, featured six keynotes, five project demonstrations, and 190 paper presentations organized into thirty sessions. Conference themes ranged from theory to application, reflecting broad coverage of computer science. Topics included Artificial Intelligence, Machine Vision, Communications, Security, e-Learning, Smart Cities, Cloud Computing, and Robotics.

Key questions of the podcast include:

:15- The Future Technologies Conference was recently held in San Francisco. Arjuna, you were the chair of the event. Can you tell us a little bit about that conference?

  • Arjuna explains how the conference catered to the academic world more than the business enterprise world, with many deep and interesting topics. People attended from around 50 countries. The breadth and depth of the presenters’ knowledge was stunning.

:50- There was such a wide variety of topics presented. Were there any specific talks or topics that were interesting to you?

  • Several topics were very interesting and directly applicable to the work we do at LexisNexis Risk Solutions, including IoT and driverless or autonomous vehicles. The progress being made at the university level was advanced and insightful.

1:38- Tell us a little bit about the autonomous vehicles. Was there a specific topic that was more interesting?

  • Presenters discussed research and challenges with autonomous vehicles, extending well beyond cars to flying vehicles and drones. For instance, presenters discussed research on how autonomous vehicle technology can be used in flight safety to help pilots navigate to safety. Research such as this could help save lives during emergencies such as engine loss or severe weather. Tremendous progress is being made.

4:05- How about Internet of Things (IoT) and its impact on big data?

  • There were at least ten presenters on IoT, most of them discussing how IoT would be applicable in theory as well as how to solve issues related to the large volumes of data that can come in through IoT systems. Arjuna discusses one particular presentation focused on pushing data to smartphones and then distributing that push technology across a wide area network, resulting in a distributed network of systems that interact with each other in a smart way (a rough sketch of the idea follows).
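The distributed-push idea can be illustrated with a minimal sketch. This is not code from the talk; the node names, readings, and summary fields are hypothetical, and the point is only that each node aggregates its own raw data locally so that just compact summaries cross the wide area network.

```python
# Toy illustration of the distributed IoT idea described above: each node
# summarizes its own sensor readings locally, and only the small summaries
# travel to the coordinator. All names and numbers are hypothetical.
from dataclasses import dataclass
from statistics import mean
import random

@dataclass
class NodeSummary:
    node_id: str
    count: int
    avg_reading: float
    max_reading: float

def summarize_node(node_id: str, readings: list[float]) -> NodeSummary:
    """Local aggregation: raw readings never leave the node."""
    return NodeSummary(node_id, len(readings), mean(readings), max(readings))

# Simulate three edge nodes, each holding its own raw data.
random.seed(7)
nodes = {f"node-{i}": [random.uniform(20, 90) for _ in range(10_000)] for i in range(3)}

# Only the compact summaries are "pushed" to the coordinator.
summaries = [summarize_node(nid, data) for nid, data in nodes.items()]
total = sum(s.count for s in summaries)
overall_avg = sum(s.avg_reading * s.count for s in summaries) / total
print(f"fleet average from {total} readings: {overall_avg:.2f}")
```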

5:45- How are HPCC Systems customers using IoT today?

  • Arjuna discusses a few notable use cases where HPCC Systems is used in IoT solutions. One implementation involves telematics applications, where IoT-related hardware in the car gathers information for general and driver behavior analytics. Another looks at how IoT can be applied to the safety of people working on factory floors. Arjuna believes next-generation IoT solutions will be more challenging because of the concentration of real-time data and the speed of response needed.

6:53- There was also quite a bit about human to machine interfaces. What stood out?

  • The topics and conversations about human-to-machine interfaces were very impactful, especially those relating to healthcare. One presentation covered a research project on interfacing the human brain with robotics to yield almost natural movement. Arjuna describes how these brain-controlled interfaces and sensors on robotic devices enabled people with spinal cord injuries to regain movement, and how yoga practice enabled some participants to gain greater control of the robotic objects.

9:37- There were a lot of really interesting industry demos with startups. Were there any that were particularly interesting to you?

  • Arjuna describes the two demos he found most interesting. One focused on search technologies using artificial intelligence and neural networks, enabling a person to use near-natural English to perform searches. Another demonstrated big data visualization techniques at large scale. As hardware evolves toward multi-core CPUs and GPUs, the team behind this demonstration used GPU technology to create a very efficient end-to-end solution, with all of the analytics performed on GPUs and CPUs and all data held in GPU or CPU memory (a rough sketch of the columnar, in-memory idea follows). The visual interface on this demonstration was very impressive.
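As a rough, hypothetical sketch of the columnar, in-memory approach described above (not the demo’s actual code): each field is kept as one in-memory column and the aggregations run directly over it. With a NumPy-compatible GPU array library such as CuPy, the same calls execute in GPU memory; the data below is synthetic.

```python
# Minimal, hypothetical sketch of column-oriented, in-memory analytics.
# Swapping `numpy` for `cupy` (a NumPy-compatible GPU array library)
# would run the same aggregations in GPU memory.
import numpy as np

rng = np.random.default_rng(0)

# One array per field, the way a columnar store keeps data.
speed_mph = rng.normal(loc=45.0, scale=12.0, size=1_000_000)
hard_brakes = rng.poisson(lam=0.3, size=1_000_000)

# Aggregations operate directly on the in-memory columns.
print("mean speed:", speed_mph.mean())
print("95th percentile speed:", np.percentile(speed_mph, 95))
print("share of trips with a hard brake:", (hard_brakes > 0).mean())
```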

13:37- What trends do you see for 2017?

  • Arjuna sees future trends focused mainly on customer needs. For instance, autonomous vehicle technologies such as AI software could provide Uber-like customer services in driverless cars or vans, or drones used to deliver goods, given their accuracy and efficiency. Technologies will continue to develop, but humans will stay involved. Arjuna also foresees more small, smart devices for business use, and more enterprise applications developed for the mobile, smart device market, ranging from big data visualization software to enterprise applications such as SAP.

17:15- In your keynote, you spoke about smart devices and their application to healthcare.

  • Similar to the way auto insurance rates are now being informed by tracking driving behavior in cars, Arjuna believes the same kind of risk analytics will be used in healthcare. Wearables can monitor inputs such as daily activity level, pulse, blood pressure, and other health indicators (a toy illustration follows). Arjuna believes this is a starting point for future behavior analysis that moves beyond historical medical records for risk assessment.
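As a purely illustrative sketch of what behavior-based scoring from wearable inputs could look like: the fields, weights, and thresholds below are invented for this example and are not anything discussed in the podcast.

```python
# Toy behavior-based risk score built from wearable readings.
# Fields, weights, and thresholds are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class DailyReading:
    steps: int
    resting_pulse_bpm: int
    systolic_bp: int

def risk_score(day: DailyReading) -> float:
    """Return a 0-1 score where higher means higher assumed risk."""
    score = 0.0
    if day.steps < 5_000:
        score += 0.4          # low activity
    if day.resting_pulse_bpm > 80:
        score += 0.3          # elevated resting pulse
    if day.systolic_bp > 130:
        score += 0.3          # elevated blood pressure
    return score

week = [
    DailyReading(steps=3_200, resting_pulse_bpm=84, systolic_bp=128),
    DailyReading(steps=9_500, resting_pulse_bpm=68, systolic_bp=118),
]
print("average weekly risk:", sum(risk_score(d) for d in week) / len(week))
```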

For more information:

Listen to the podcast on SoundCloud, iTunes, or Google Play

Watch Arjuna’s presentation at the Future Technologies Conference:

Prefer to read the transcript? Below is the full podcast transcript:

Jessica Lorti: Thank you for joining us for the HPCC Systems podcast, The Download. This is Jessica Lorti. I’m here with Arjuna Chala, and we’re talking about the Future Technologies Conference that was recently held in San Francisco. Arjuna, you were the chair. Can you tell us a little bit about that conference?

Arjuna Chala: Hi everybody. Yeah, absolutely. The conference really catered to the academic world more than the business enterprise world, but that said there were [00:00:30] some very interesting and deep knowledge topics that were discussed across the board. It was very interesting to meet people from around 50 countries worldwide. Obviously, there were some challenges around communication, but once you get to know the people and the breadth and the width of the knowledge that they had, it was really amazing and stunning in many cases.

Jessica Lorti: Yeah, I was amazed just looking through the agenda. The topics that they had [00:01:00] were fantastic. Are there any specific talks or topics that were really interesting to you?

Arjuna Chala: Yeah, there were several topics I think that are very interesting, from the perspective of what we do on a daily basis. Obviously, the IoT topic, and driverless vehicles or autonomous vehicles is something that they’ve been working on at the university level in many cases as research projects. The progress that they have made is amazing. Also some of the assumptions that we [00:01:30] made, as far as when we are developing our solutions, kind of proved out with the research that they are doing. At the same time, we also learned some new things that we had not known before, listening to some of the conversations they were having.

Jessica Lorti: Tell us a little bit about the autonomous vehicles. Was there a specific topic?

Arjuna Chala: With regard to autonomous vehicles, their research really was not [00:02:00] specifically around the cars. It also expanded to planes, drones, et cetera. One of the key challenges obviously, like everybody else they had, was around the regulation and all applicable regulation, and when and where autonomous cars would be applicable, and how the regulations would apply to this autonomous car. Also, there were fears around security, from the perspective of: if somebody [00:02:30] would hack into these systems, what then? The general impression is they’re making a lot of progress from the perspective of developing newer technologies with regard to autonomous vehicles. Even though we already know that some of these technologies are already on the road with regard to the Tesla cars, or the Google cars for that matter. They’re making more progress with regard to flying vehicles like airplanes as well as drones. [00:03:00] One of the things that came up was the safety around flying planes, and if there was an issue with something going wrong when the plane was in flight, how autonomous vehicle technology would help the pilots or the other people on board to quickly be able to navigate to safety. Which I think was one of the key things that came out of the topic. Typically, we think of autonomous vehicles [00:03:30] as making everybody’s life just easier, but when it was applied to safety and especially safety with regard to saving people’s lives on planes when something goes wrong, whether weather conditions were bad, or you lost an engine on the plane, how do you get to the nearest airport? Those things were talked about in that and it was really very interesting.

Jessica Lorti: [00:04:00] That’s really neat. I hadn’t realized that they were looking at autonomous vehicles for more than just cars. That’s what we hear about all the time. All sorts of vehicles. All right, how about IoT? There were a ton of topics on IoT, and that’s clearly a big interest for big data.

Arjuna Chala: I was actually amazed with the presentation in fact. One of the sessions, I think, there were about 10 speakers in the sessions, and all of them related to IoT research.[00:04:30] Either they were talking about some specific IoT product or solution that they had developed, or they were talking about the theory behind some of the IoT solutions that they had. Interesting, and nevertheless some of it was kind of hard for me to understand, but I kept going. [00:05:00] Quite a lot of it revolved around how IoT would be applicable in theory, right? As well as how they’re able to solve some of the problems with the large amount of volume of data coming into IoT systems. One of the talks revolved around pushing data into smartphones, and distributing this push technology across a wide area network. Instead of thinking of it as one single central system where all the data has to come in [00:05:30] and reside, they are talking about it as more of a distributed network of systems, which interact with each other in a smart way, so that all the data does not have to go to and fro. Because the data can be quite voluminous if you think about gathering everything into a central system.

Jessica Lorti: HPCC Systems, we clearly have several customers using IoT. [00:06:00]

Arjuna Chala: Yeah. I mean, so far as IoT customers are concerned, we do have a few customers. One use case really gets us towards the telematics applications kind of field, where you have IoT related hardware in your car, and we gather the information from there and send it across to a central system for analytics and driver behavior analytics. Or, in another case, it’s more around seeing how IoT can be applicable to safety of people [00:06:30] working on factory floors. We’ve been working on both use cases successfully, but I think the next generation of IoT solutions is going to be much more challenging, just because of the real time concentration and the need for responding very quickly. [00:07:00]

Jessica Lorti: Perfect. There was also quite a bit about human to machine interfaces. That’s something we’re also very interested in, right?

Arjuna Chala: Yes. I do not know how it will be applicable to us at LexisNexis, but the topic and conversations that were happening on the human to machine interfaces were very impactful. Especially as it related to healthcare. They were talking about how disabled people, and also people who had the misfortune of being in an accident and had a spinal cord injury, and they could not use their limbs as before, [00:07:30] how do you help these people to regain almost, you cannot say natural movement, but at least movement that was acceptable? One of the things that they were doing is they were looking at interfacing the brain, the human brain, with robotics. Robotics could be a complete robot, or it could be just a simple arm-like movement, or a throwing movement. The cool part of all this was how [00:08:00] they use the brain. The guy was just sitting in the room and he was not doing any movement himself, but he was using his brain to actually communicate to the sensors on all these other devices. The devices performed pretty much what he wanted them to do. If he’d say, “Move, move right,” the arm would move to the right. If he’d say, “Move left,” the arm would move to the left. If he would say, “Pick up that ball from the top,” it would pick up the ball from the top. It is not that he’s saying it, but he’s thinking and controlling it [00:08:30] through his mind. One of the key things that came out of this thing was they had an experiment as part of the research study, where they talked to people who performed yoga on a daily basis. They collected two teams, one who did yoga on a daily basis for a year, and another team which really did [00:09:00] not do yoga at all. Each team had 20 people, and they wanted to see how they were able to control their mind, use their control of the mind to actually control these robotic objects. 19 out of the 20 people who did yoga were able to easily control the robotic arms, whereas the guys in the other group, hardly a few were able to control the robotic arm. It clearly indicated that there was some relation [00:09:30] between yoga and the ability to control these robotic arms.

Jessica Lorti: Lesson for the day, mental focus with yoga.

Arjuna Chala: Absolutely.

Jessica Lorti: There were a lot of really interesting industry demos with startups. Were there any that were particularly interesting to you?

Arjuna Chala: There were actually … Yes. There were about three demos from the industry from what I can recall. Two of them were extremely interesting. One of them [00:10:00] really catered to search technologies, and search technology is something that we do on a daily basis across the board. Especially in the legal world, as well as in the scientific world, I think what this company had done would be very applicable. The key differentiator I thought was the use of artificial intelligence as it relates to neural networks to perform these searches. They could literally [00:10:30] understand what the person was trying to ask for. Literally, in almost English language like terms. Then take that and find the actual information in these documents, almost precisely. It was a very impressive demonstration. The other demonstration was around visualization techniques. Again, this is on large scale, big data visualization techniques. As we know, the [00:11:00] technology’s evolving to multi cores and GPUs from a perspective of new computers, and the team that developed this new visualization technology was basing it off using GPU technology very efficiently. In the sense if GPUs were available, they were able to load a subset of the data that is being analyzed onto the GPUs for processing. Literally it was a columnar-store database that they had custom created for this purpose, and all the analytics was performed on [00:11:30] GPUs and CPUs, and all the memory is either on the GPUs or the CPUs. Optimized through their algorithms, as far as processing is concerned. On top of that data, very impressive visual interface as well, to go with it. The cool thing about it is it’s not one part of the solution, they had an end to end solution. [00:12:00] Mostly using GPU technology to cater to the needs, it was very impressive.

Jessica Lorti: I personally spoke with people that had amazing backgrounds and were working on amazing things. Were there any people that really stuck out to you?

Arjuna Chala: To me, the main people that really stuck out were the guys who had some knowledge [00:12:30] about the industry. Again, I go back to the search technology person whom we had interacted with before when he demoed the system. He was a fresh academic, in the sense that they’d graduated recently. Their thinking was always around how to solve real user problems, right? I think some of them are the ones that actually separated themselves from the rest of the pure academic crowd. No doubt, with the pure academic crowd, [00:13:00] there’s a lot of theory and a lot of passion behind the theory as well, in the way they researched and executed their papers. At the same time, my interest was really around how their solutions were progressing towards solving something that end users need. That search technology person was impressive, as well as the topic on visualizations, as well as on the IoT side.

Jessica Lorti: We saw all these wonderful technologies and the things that the academics are working on, and they’re starting to adopt within some of the industries. That leads us to 2017 in trends. What do you think’s going to happen in 2017? [00:14:00]

Arjuna Chala: 2017 I think obviously there’s a lot of user needs. As we already know, autonomous cars are going to become a reality from what Tesla has indicated, GM has indicated, and all these other cars are on the verge of … Even bringing in autonomous car services where literally there’s no driver, but they provide Uber-like services to you. That is going to definitely impact software development practices, and I think from that perspective, newer technologies are [00:14:30] definitely going to be in play. Whether AI software, or software that more closely integrates with autonomous technologies. That said, there’s other fields as well. Drones I think is going to be huge. I see drones being investigated as alternative tools for delivery of goods. You probably will see an autonomous car come to your house. Autonomous van, equal to like your FedEx or UPS van. [00:15:00] Maybe a drone will actually do the delivery, there won’t be any driver in the seat, a drone will pick up the package and put it in front of your house. I’ve read articles around people experimenting with this, mainly because drones and also machines do not make mistakes, and they’re on time, most of the time. Obviously, it reduces inefficiency. Again, there is a huge debate. Would they replace [00:15:30] humans completely, or not? Again, it’s a good debate from the perspective of obviously technology’s technology, we need to make sure it keeps going to the next level, but there’s always some things that humans will also be able to provide as well.

The other thing that I see now, coming back to more basic as far as trends is concerned, one of the neglected areas that [00:16:00] I see is enterprise software neglecting the potential of what I call smart devices. When I say smart devices, it’s phones, also your smartphones, your smart watches, your phablets, tablets, et cetera. It’s more smart devices, basically small devices that are smart. Business and enterprise software really are not catering to this need on the smart devices. One thing you know, you see very few applications that work on smart devices at [00:16:30] the enterprise level. I think in the next year, more and more, you’ll see people just leveraging smart devices for everything. I don’t think that you’re going to see businesspeople carrying laptops and going around. At the maximum they’re going to be carrying a tablet with them, but in most cases they’ll probably rely on their cell phones. If that is the trend, then you better develop applications that cater to this industry. Whether it’s a big data analytics visualization software, or [00:17:00] if it is your enterprise application like SAP or another enterprise software application. All of them are going to be more catering towards the smart device technologies.

Jessica Lorti: In your keynote, you spoke about smart devices and the application to healthcare.

Arjuna Chala: Oh, yeah. Probably kind of scared a lot of people. Again, the trend is moving [00:17:30] towards where future behavior is going to be used to perform analytics of the risk. For example if I have to find out what is the risk of providing insurance to a patient, instead of looking at his previous health history, we can motivate the patient to actually benefit from good behavior in the future, by encouraging [00:18:00] them to perform exercises regularly, or maintain good eating habits. The key is how do we monitor all this? Obviously nowadays we have wearables that can monitor your … Probably not your diet, but it can pretty much monitor daily level of activity, your pulse, blood pressure, et cetera. I think that is a starting point [00:18:30] where we can go into more future behavior analysis for risk, versus looking at just the historic information in the patient’s medical records. I think this is going to change a lot of industries. Healthcare is one of them. The same thing is already changing the car insurance industry. If you look at it, there are already applications which track your driving behavior, and based on your good driving behavior, your insurance premium is adjusted. The same things are bound to happen in the healthcare industry as well.

Jessica Lorti: Thank you Arjuna, I appreciate you taking the time with me today.

Arjuna Chala: Thank you Jessica.