The Future of Work – Amazing new technologies and Citrix Startup Accelerator (draft)

I recently attended the O’Reilly Solid conference. It was very cool, and right at the heart of the current enthusiasm for ‘makers’, for IoT, and for hackery of all types. If you’re interested, go ahead and check out some of their videos. I’ll still be here.

One talk I found particularly inspiring introduced a new device designed to use emotive technology to very simply increase alertness or relaxation. The founder of doppel talked about some of the background to their work, exploring phenomena like the rubber hand illusion, how this can be generalized with digital hands, and a range of other cognitive illusions. The end result of their exploration is a very down-to-earth wearable device that provides a heartbeat-like pulse, producing statistically significant changes in human physiology and performance. Unfortunately there is no video of this talk, but it is worth taking a look at their Kickstarter.

I’m fascinated by these types of cognitive hacks. Imagine how phenomena like this could enhance virtual reality, or make virtual meetings more productive. There’s amazing new innovation out there around giving us new senses – such as seeing with the tongue (now FDA approved), or explorations into adding a directional sense. The idea of using technology to make us stronger, faster, and more productive is introduced in the Citrix 2020 technology landscape document and is one element of how our workplaces will change in the near future.

Another rapidly emerging area is the use of voice recognition. I’ve been using the Amazon Echo at home, and it’s amazing how quickly even the most technophobic of my family are willing to use this device. ‘Alexa, play <a random annoying teen pop artist> radio’. It’s fast, easy, and dramatically reduces barriers to use.

As of today, the types of commands provided natively by Echo are relatively simple. Yet even in challenging situations Echo can hear and understand requests so well that it feels like magic. Imagine having this technology in the workplace – in every meeting room. No more wrestling with terrible user interfaces for display options and the rest.

Check out a demo of this in action at Citrix Synergy 2015 with the Citrix Workplace Hub and Octoblu.

What about meeting rooms that can identify who’s physically in a meeting, pick out the discussion themes, sense if people are upset, or indeed foster better collaboration? Imagine having the Jarvis virtual assistant from ‘Iron Man’ at your work. Or, if that’s not your scene, what else might be coming soon? How about instant hardware prototyping? What new IoT devices might make sense in the workplace? What if we could trial hardware as simply and easily as software?

One of the themes for Citrix Startup Accelerator is the ‘future of work’. The goal for this investment theme is to back first-class startups bringing new approaches and capabilities to these emerging themes – making our workplaces more productive and more human-centric. One example from our portfolio is WhoKnows, an amazing company bringing very pragmatic improvements to the way we understand the rest of our team, and indeed everyone in the organization. I like to think of them as making the whole of a large company as simple to work with as a single workgroup. This is a huge challenge, and one that existing approaches have not yet solved.

Do you have a great new approach to ‘The future of work’? Let me know.

Michael

Posted in Uncategorized | Leave a comment

Emerging Technologies conference (EmTech 2014)


I had the good fortune to attend and speak at the 2014 MIT EmTech conference. I was particularly struck by the Astro Teller and Yoky Matsuoka talks. There was also awesome content on robotics, climate issues, hacking the mind, and more.

Here’s a link to videos of all the presentations: http://www.technologyreview.com/emtech/14/video/

Astro Teller – Google [x]

  • GoogleX has a fluid structure and shared resources. Teller effectively functions as a board member for all projects.
  • Why is 10x the right measurement?
    • Get away from incremental thinking
    • “If just looking for a 10% improvement, engineers will start by improving the current <car>”
    • Radical requirement forces dropping assumptions
  • Philosophy of moonshot thinking
    • Enormous problem that can be named
    • Radical solution – if solution is clear/straightforward/well understood – nice, but not the ethos of their culture
    • Based on science and technology – and needing a breakthrough
  • Interesting – they have marketing in the group, but called by a different name … “Head of getting moonshots ready for the outside world”… can’t have “marketing” as it ‘scares off the innovators’ – [MH – Contrast this to ‘lean startup’ mindset, where marketing & customer focus, is front and center]
  • Projects are picked through a process that depends on Larry and Sergey’s intuition.
  • Google Glass is not just a computer for your face – the real calling of wearables in general is to get out of your way – in exchange for being on your face it needs to just work – no UI.
    • e.g. Teller’s life is ‘leveled up’ when the glasses are on and ‘leveled down’ when not – how to help the digital world and physical world work without the schism. The new Google Glass leader is trained to be sensitive to using technology to get technology out of the way, rather than thinking of technology as a benefit in itself.
  • Solve for X
    • More people should be doing Google X like things
    • Solve for X is a distilled version of this – find something that would make the world a radically better place – lots of incremental users
    • Originally thought there might be a pipeline problem – initially pushing hard for a conference to drink from the firehose of new ideas – but this turns out not to be the challenge. People actually want to do this; it’s important for the world.
    • Structuring of molecules – hold same amount of gas at 1/4 of the pressure
    • Nick Negroponte – beam power to spaceship as it lifts off – capture and convert the heat
  • Project Loon – in the next year or so there will be a semi-permanent ring of balloons in the southern hemisphere

Michael Commentary:

** Very cool – sufficiently deep pockets to be able to solve the hard things, then worry about customers/market dynamics later.

Yoky Matsuoka – NEST (Google)

  • Started with a big problem – but one that can be embodied in a consumer product – easy to use and enriches your life
  • Background
    • Robots and neuroscience intersection – understand more of how human brain works using robotics tech
    • Use neuro understanding to build the right robots
    • See if robotic technology can help people with neuro problems
    • Created the center for Sensorimotor neural engineering
    • Relationship between device and human – tech can understand people – tech can do too much (people don’t learn) – too little is not helpful — Yin and Yang. YokyWorks: engineering for the human experience
  • NEST
    • Solving big problems – solve problems that consumer apps …
    • 50% of domestic energy use is from heating and cooling
    • People are not good at doing this – potential to save at least 20% of the energy
    • 3 of 5 deaths happen in homes without working smoke alarms
    • Advanced technology in a beautiful package (otherwise known as the inner geek)
    • Saved over 2 billion kWh – at least as compared with just running with a fixed thermostat setting
    • Growing upsides from connectivity – collaborations
      • With Mercedes: The car knows when people are getting home
      • With Whirlpool – reduce noise by running only when people are away, run the fluff cycle when about to be home
    • Lessons
      • Continuing to learn about customers and deployment environment is critical
      • Initial assumption: if someone has purchased the nest, they will want to save power – hence a heavy focus on learning when to turn off the system.
      • However, for many buyers, the appeal is not energy saving, as much as beauty and adaptation to different household needs. Hence aggressive power saving was making them unhappy – NEST quickly adapted to a more refined approach.
      • Biggest surprise about how people actually use these devices – people touch the thermostat all the time. 1.6 touches per day.
    • Motivations: Matsuoka has an amazing background – the question came up for why she chose to do a thermostat. Turns out that some of her more forward thinking approaches did not gain market acceptance (robots for rehabilitation in the home, etc). NEST was the obvious and necessary next step as a way to gain acceptance for sophisticated devices in the home.

Michael Commentary:

** Nest is an amazing success story, but the main lesson here is around market dynamics, and the success Matsuoka had in identifying a necessary and viable innovation stepping stone.


Innovation and startups – a simple manifesto


Innovation is a dance between culture and technology. A matching of what’s possible with the magic of adoption of new things. It’s no good creating a technology only to discover that there’s nowhere for it to go. We have few chances to influence what will be, so those that we have should be treasured.

In 2011 I moved with my family from Australia to California to set up Citrix Startup Accelerator. The underlying idea is that we can be more effective at inventing the future, and seeing what’s coming next, not just by running research in house, but also by aligning with the ‘innovation machines’ of Silicon Valley and other global entrepreneurial communities.


Fortune at the bottom of the pyramid – too good to be true?

I’m a bit behind the story here, but just came across this paper (from 2007) on the Kauffman.org site claiming that selling to the bottom of the pyramid isn’t a win (in most cases). Worth a read if, like me, you’re behind the story on this one …

“The popular ‘bottom of the pyramid’ (BOP) proposition argues that large companies can make a fortune by selling to poor people and simultaneously help eradicate poverty. While a few market opportunities do exist, the market at the BOP is generally too small monetarily to be very profitable for most multinationals. At the same time, the private sector can play a key role in poverty alleviation by viewing the poor as producers, and emphasize buying from them, rather than selling to them.”


Fortune at the Bottom of the Pyramid: A Mirage – How the private sector can help alleviate poverty

Aneel Karnani, Stephen M. Ross School of Business at the University of Michigan

E-mail: akarnani@umich.edu

April 2007

http://papers.ssrn.com/sol3/papers.cfm?abstract_id=914518 

 


The Future Of Mobile – one slide from SAI deck



The Free Universal Construction Kit | F.A.T.


F.A.T. Lab and Sy-Lab are pleased to present the Free Universal Construction Kit: a matrix of nearly 80 adapter bricks that enable complete interoperability between ten* popular children’s construction toys. By allowing any piece to join to any other, the Kit encourages totally new forms of intercourse between otherwise closed systems—enabling radically hybrid constructive play, the creation of previously impossible designs, and ultimately, more creative opportunities for kids. As with other grassroots interoperability remedies, the Free Universal Construction Kit implements proprietary protocols in order to provide a public service unmet—or unmeetable—by corporate interests.

By itself this is a massively cool exercise, but brings up all sorts of interesting philosophical questions about using technologies of all stripes, together, in shocking mix and match rainbows of competence and otherwise. It’s not so far from innovation in the large, the idea that bringing together different domains of expertise results in all sorts of interesting new ideas, or from the tremendous enthusiasm for startups and indeed any type of ‘making’ that happens again and again in SV and in special parts of the world.

In this case, combining toys with different construction ‘affordances’ makes for a much more flexible, more universal construction capability. It’s also super cool in that it makes the capabilities of 3D printing ever more clear in terms of bridging into a new world of IP concerns, capabilities, and freedoms.

For more on the Universal construction kit see http://fffff.at/free-universal-construction-kit/
For an interesting riff on makers and the emerging world of 3d see http://venturebeat.com/2012/03/21/dylans-desk-when-craftsmanship-meets-tech-m…


Cloud Robotics Hackathon – Notes and thoughts

Last weekend Andra organized a very cool robotics hackathon at the Citrix Startup Accelerator as part of the Cloud Robotics Hackathon. Apart from being ground breaking, it was a heck of a lot of fun. Here’s some images and videos of the event on the Google+ for RobotLaunchpad.

In this blog I wanted to capture the approach that Guy Bieber and I settled on with the supplied RobotShop Rover, ultimately using a Bluetooth module and Andra’s HTC Android phone. This is intended to be a quick capture of the approach and code to help others who use similar kits.


This was a fun weekend and a great way to explore the near future of these types of devices and their use with cloud functionality. I’m looking forward to much more from commodity robots linked with smartphones and the cloud. This was a good way to make the Internet of Things more visceral.

__

The goal of the exercise was to build a useful tool with a combination of the rover and the myrobots.com site for robot connectivity. We had discussed a bunch of possible outcomes including:

  • A robot that responds to (online visible) sports outcomes by dancing happy or sad dances
  • Listening for particular phrases, then adding notes to Evernote
  • Taking pictures when names are given
  • If bored, playing a YouTube video
  • Linking to Mechanical Turk to assist with navigation
  • Using IFTTT to control outcomes

Ultimately, however, it took quite a while to get the robot working and to iterate through possible combinations of connectivity, programming environment, and more. Hence we didn’t play with the possibilities as much as anticipated, but we did find a very workable combination of technologies that also – happily – leverages the smartphone, which is one of the less complex routes to the ‘Internet of Things’ and to affordable commercialization of a bunch of interesting robotic ideas.

What didn’t work:

  • We found that the USB host connection to Android, and using the USB Host kit, was simply too complex for a quick weekend hack. This would make a lot of sense to revisit at time of commercialization.
  • Bluetooth connection to iPhone – we’re both iPhone users, and thought that getting a Bluetooth connection up would make iPhone control viable. However this was not the case, as Apple has locked down Bluetooth connectivity to only ‘blessed’ devices.
  • Easy connectivity to the myrobots.com site by device or Android – however, with judicious exploration of scripts for ThingSpeak we found some viable options.

What tweaks were needed along the way:

The code fragments that follow are not neat, particularly well documented, nor particularly well written; however, they are provided in the hope that they are of use to others exploring these technologies in future.


    ____

    SL4A code

    ##############################################################################
    #
    # Drive rover with voice commands from iPhone
    # startBluetooth came from uibtre.py - this also contains code for accelerometer control
    # (http://www.youtube.com/watch?v=BtmRBxRsMk4, http://code.google.com/p/android-jp-kobe/source/browse/trunk/pyAndyUI/uibtre.py)
    #
    # Michael Harries -- March 4, 2012
    ##############################################################################
    import sys
    import time
    import json
    import httplib, urllib
    import android

    d = android.Android()

    def startbluetooth():
        uuid = '00001101-0000-1000-8000-00805F9B34FB'
        d.dialogCreateAlert( "select BlueTooth operation" )
        d.dialogSetPositiveButtonText( "server" )
        d.dialogSetNeutralButtonText( "client" )
        d.dialogSetNegativeButtonText( "no-BT" )
        d.dialogShow()
        ret = d.dialogGetResponse().result[ "which" ]

        if ret == "positive":
            d.bluetoothMakeDiscoverable()
            d.bluetoothAccept( uuid )
            return True
        elif ret == "neutral":
            ret = d.bluetoothConnect( uuid ).result
            if not ret:
                d.makeToast( "bluetooth not connected" )
                sys.exit( 0 )
            return True
        print "skip bt setup"
        return False

    def main():
        prevchar=' '
        nextchar=' '
        # d.startSensingTimed( 2, 300 )   ### sense start

        fBT = startbluetooth()
        d.ttsSpeak("command me baby")
        #d.ttsSpeak("forward backward left right stop cloud quit")
       
        while True:
            command = d.recognizeSpeech("Command me baby", None, None)
            print command[1]
           
            if command[1] == "stop":
                nextchar = 'x'
                fBT and d.bluetoothWrite('x')
                d.ttsSpeak("stop")
            elif command[1] == "left":
                nextchar = 'a'
                fBT and d.bluetoothWrite('a')
                d.ttsSpeak("left")
            elif command[1] == "right":
                nextchar = 'd'
                fBT and d.bluetoothWrite('d')
                d.ttsSpeak("right")
            elif command[1] == "forward":
                nextchar = 'w'
                fBT and d.bluetoothWrite('w')   
                d.ttsSpeak("forward")
            elif command[1] == "back":
                nextchar = 's'
                fBT and d.bluetoothWrite('s')
                d.ttsSpeak("back")
            elif command[1]=="cloud":
                nextchar='c'
                # grab command by myrobot.com
                conn = httplib.HTTPConnection("bots.myrobots.com")
                conn.request("GET", "/channels/595/feed/last.json")
                response = conn.getresponse()
                print response.status, response.reason
                json_string = response.read()
                print json_string
                conn.close()
                data = json.loads(json_string)
                print data
                #say
                say=data['field2']
                if say:
                    d.ttsSpeak(say)
                #move
                move=data['field1']
                if move:
                    print move
                    fBT and d.bluetoothWrite(move)
                    time.sleep(1)
                    fBT and d.bluetoothWrite('x')
                #play
                play=data['field3']
                if play:
                    print play
                    d.mediaPlay(play)
                    fBT and d.bluetoothWrite('a')
                    time.sleep(.5)
                    fBT and d.bluetoothWrite('d')
                    time.sleep(.5)
                    fBT and d.bluetoothWrite('w')
                    time.sleep(.5)
                    fBT and d.bluetoothWrite('s')
                    time.sleep(.5)
                    fBT and d.bluetoothWrite('x')
               
            elif command[1] == "quit":
                fBT and d.bluetoothWrite('x')
                d.ttsSpeak("Did I do something wrong")
                return True
               
            #time.sleep( 0.5 )
            prevchar = nextchar
       
        d.dialogDismiss()

    if __name__ == "__main__":
        main()

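The long if/elif chain in main() maps each recognized word to the single-character motor command the rover expects. A table-driven version is more compact; here is a minimal sketch in plain Python (the dictionary and function names are mine, and the SL4A calls are omitted so it runs anywhere):

```python
# Map recognized speech to the single-character motor commands the
# Arduino sketch expects ('w' forward, 's' back, 'a' left, 'd' right, 'x' stop).
COMMANDS = {
    "forward": 'w',
    "back": 's',
    "left": 'a',
    "right": 'd',
    "stop": 'x',
}

def to_motor_char(word, default='x'):
    """Return the motor character for a recognized word; stop on anything unknown."""
    return COMMANDS.get(word, default)
```

With this in place, the movement branches collapse to a single `fBT and d.bluetoothWrite(to_motor_char(command[1]))`, leaving only the ‘cloud’ and ‘quit’ cases as explicit branches.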
    ————

     

     

    Arduino – Rover code

    int E1 = 6; //M1 Speed Control
    int E2 = 5; //M2 Speed Control
    int M1 = 8; //M1 Direction Control
    int M2 = 7; //M2 Direction Control
    void setup(void) {
      int i;
      for(i=5;i<=8;i++) pinMode(i, OUTPUT);
      Serial.begin(9600);
    }
    void loop(void) {
      while (Serial.available() < 1) {
      } // Wait until a character is received
      char val = Serial.read();
      int leftspeed = 255; //255 is maximum speed
      int rightspeed = 255;
      switch(val) // Perform an action depending on the command
      {
      case 'w'://Move Forward
        forward (leftspeed,rightspeed);
        break;
      case 's'://Move Backwards
        reverse (leftspeed,rightspeed);
        break;
      case 'a'://Turn Left
        left (leftspeed,rightspeed);
        break;
      case 'd'://Turn Right
        right (leftspeed,rightspeed);
        break;
      case 'x'://stop
        stop ();
        break;
      default:
        stop();
        break;
      }
    }
    void stop(void) //Stop
    {
      digitalWrite(E1,LOW);
      digitalWrite(E2,LOW);
    }
    void forward(char a,char b) {
      analogWrite (E1,a);
      digitalWrite(M1,LOW);
      analogWrite (E2,b);
      digitalWrite(M2,LOW);
    }
    void reverse (char a,char b) {
      analogWrite (E1,a);
      digitalWrite(M1,HIGH);
      analogWrite (E2,b);
      digitalWrite(M2,HIGH);
    }
    void left (char a,char b) {
      analogWrite (E1,a);
      digitalWrite(M1,HIGH);
      analogWrite (E2,b);
      digitalWrite(M2,LOW);
    }
    void right (char a,char b) {
      analogWrite (E1,a);
      digitalWrite(M1,LOW);
      analogWrite (E2,b);
      digitalWrite(M2,HIGH);
    }

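The rover listens for one command character at a time over a 9600-baud serial link, so it can also be driven from a laptop rather than the phone. A hedged sketch using pyserial – the port name is an assumption that depends on your adapter, and `drive` is a helper of my own, not part of the rover kit:

```python
VALID = set('wasdx')  # forward / left / back / right / stop, per the Arduino sketch

def validate(cmd):
    """Return the command as bytes, rejecting anything the rover would treat as stop."""
    if cmd not in VALID:
        raise ValueError("unknown rover command: %r" % cmd)
    return cmd.encode('ascii')

def drive(port_name, cmds, baud=9600):
    """Send a sequence of single-character commands over the rover's serial link."""
    import serial  # pip install pyserial
    with serial.Serial(port_name, baud, timeout=1) as ser:
        for c in cmds:
            ser.write(validate(c))
```

For example, `drive('/dev/ttyUSB0', 'wax')` would nudge the rover forward, turn left, then stop (the port name will vary with your setup).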
    ------------

     

    Python script for uploading commands to myrobots.com

    import httplib, urllib
    import time
     
    def doit():
        params = urllib.urlencode({'field1': 'w', 'field2': 'not quite a square','field3': 'http://www.youtube.com/watch?v=WKxx5QC0ewc#t=57s','key':'B85D18A801134D7F'})
        headers = {"Content-type": "application/x-www-form-urlencoded","Accept": "text/plain"}
        conn = httplib.HTTPConnection("bots.myrobots.com")
        conn.request("POST", "/update", params, headers)
        response = conn.getresponse()
        print response.status, response.reason
        data = response.read()
        conn.close()
     
    #sleep for 16 seconds (api limit of 15 secs)
    if __name__ == "__main__":
        doit()

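The comment above notes the API’s 15-second update limit, but the script as written posts only once. To stream a series of commands, the updates need pacing; here is a sketch of that scheduling logic (pure Python so it can be tested offline – `post` stands in for the doit() POST above, and the 16-second default matches the comment’s margin over the ThingSpeak-style limit):

```python
import time

def run_updates(commands, post, min_interval=16, sleep=time.sleep, clock=time.time):
    """Post each command in turn, never faster than one update per min_interval
    seconds. sleep/clock are injectable so the pacing can be tested offline."""
    last = None
    for cmd in commands:
        now = clock()
        if last is not None and now - last < min_interval:
            sleep(min_interval - (now - last))
            now = clock()
        post(cmd)  # e.g. doit() with this command in field1
        last = now
```

In the real script, `post` would rebuild the urlencoded params with the new field values and repeat the POST to bots.myrobots.com.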