Fortune at the bottom of the pyramid – too good to be true?

I’m a bit behind the story here, but just came across this paper (from 2007) on the Kauffman.org site claiming that selling to the bottom of the pyramid isn’t a win (in most cases). Worth a read if, like me, you’re behind the story on this one …

“The popular ‘bottom of the pyramid’ (BOP) proposition argues that large companies can make a fortune by selling to poor people and simultaneously help eradicate poverty. While a few market opportunities do exist, the market at the BOP is generally too small monetarily to be very profitable for most multinationals. At the same time, the private sector can play a key role in poverty alleviation by viewing the poor as producers, and emphasizing buying from them, rather than selling to them.”


Fortune at the Bottom of the Pyramid: A Mirage
How the private sector can help alleviate poverty

Aneel Karnani, Stephen M. Ross School of Business, University of Michigan

E-mail: akarnani@umich.edu

April 2007

http://papers.ssrn.com/sol3/papers.cfm?abstract_id=914518 

 

The Free Universal Construction Kit | F.A.T.


F.A.T. Lab and Sy-Lab are pleased to present the Free Universal Construction Kit: a matrix of nearly 80 adapter bricks that enable complete interoperability between ten* popular children’s construction toys. By allowing any piece to join to any other, the Kit encourages totally new forms of intercourse between otherwise closed systems—enabling radically hybrid constructive play, the creation of previously impossible designs, and ultimately, more creative opportunities for kids. As with other grassroots interoperability remedies, the Free Universal Construction Kit implements proprietary protocols in order to provide a public service unmet—or unmeetable—by corporate interests.

By itself this is a massively cool exercise, but it brings up all sorts of interesting philosophical questions about using technologies of all stripes together, in shocking mix-and-match combinations. It’s not so far from innovation in the large – the idea that bringing together different domains of expertise results in all sorts of interesting new ideas – or from the tremendous enthusiasm for startups, and indeed any type of ‘making’, that surfaces again and again in Silicon Valley and in special parts of the world.

In this case, combining toys with different construction ‘affordances’ makes for a much more flexible, more universal construction capability. It also makes the capabilities of 3d printing ever more clear in terms of bridging into a new world of IP concerns, capabilities and freedoms.

For more on the Universal construction kit see http://fffff.at/free-universal-construction-kit/
For an interesting riff on makers and the emerging world of 3d see http://venturebeat.com/2012/03/21/dylans-desk-when-craftsmanship-meets-tech-m…

Cloud Robotics Hackathon – Notes and thoughts

Last weekend Andra organized a very cool robotics hackathon at the Citrix Startup Accelerator as part of the Cloud Robotics Hackathon. Apart from being groundbreaking, it was a heck of a lot of fun. Here are some images and videos of the event on the Google+ page for RobotLaunchpad.

In this post I want to capture the approach that Guy Bieber and I settled on, using the supplied RobotShop Rover with a Bluetooth module and Andra’s HTC Android phone. This is intended as a quick capture of the approach and code, to help others who use similar kits.


This was a fun weekend and way to explore the near future of these types of devices and using them with cloud functionality. I’m looking forward to much more from commodity robots linked in with SmartPhones and the cloud. This was a good way to make the Internet of Things more visceral.

__

The goal of the exercise was to build a useful tool with a combination of the rover and the myrobots.com site for robot connectivity. We had discussed a bunch of possible outcomes including:

  • A robot that responds to (online-visible) sports outcomes by dancing happy or sad dances
  • Listening for particular phrases, then adding notes to Evernote
  • Taking pictures when names are given
  • Playing a YouTube video when bored
  • Linking to Mechanical Turk to assist with navigation
  • Using IFTTT to control outcomes

Ultimately, however, it took quite a while to get the robot working and to iterate through possible combinations of connectivity, programming environment and more. Hence we didn’t play with the possibilities as much as anticipated, but we did find a very workable combination of technologies that also – happily – leverages the SmartPhone, which is one of the less complex routes to the ‘Internet of Things’ and to affordable commercialization of a bunch of interesting robotic ideas.

What didn’t work:

  • We found that the USB host connection to Android, using the USB Host kit, was simply too complex for a quick weekend hack. This would make a lot of sense to revisit at commercialization time.
  • Bluetooth connection to iPhone – we’re both iPhone users, and thought that getting a Bluetooth connection up would make iPhone control viable. However, this was not the case, as Apple has locked Bluetooth connectivity down to ‘blessed’ devices only.
  • Easy connectivity to the myrobots.com site from the device or from Android – however, with judicious exploration of ThingSpeak scripts, we found some viable options.

What tweaks were needed along the way:

The code fragments that follow are not neat, nor particularly well documented or well written; they are provided in the hope that they will be of use to others exploring these technologies in future.

     

    ____

    SL4A code

    ##############################################################################
    #
    # Drive rover with voice commands from iPhone
    # startBluetooth came from uibtre.py - this also contains code for accelerometer control
    # (http://www.youtube.com/watch?v=BtmRBxRsMk4, http://code.google.com/p/android-jp-kobe/source/browse/trunk/pyAndyUI/uibtre.py)
    #
    # Michael Harries -- March 4, 2012
    ##############################################################################
    import sys
    import time
    import json
    import httplib, urllib
    import android

    d = android.Android()

    def startbluetooth():
        uuid = '00001101-0000-1000-8000-00805F9B34FB'
        d.dialogCreateAlert( "select BlueTooth operation" )
        d.dialogSetPositiveButtonText( "server" )
        d.dialogSetNeutralButtonText( "client" )
        d.dialogSetNegativeButtonText( "no-BT" )
        d.dialogShow()
        ret = d.dialogGetResponse().result[ "which" ]

        if ret == "positive":
            d.bluetoothMakeDiscoverable()
            d.bluetoothAccept( uuid )
            return True
        elif ret == "neutral":
            ret = d.bluetoothConnect( uuid ).result
            if not ret:
                d.makeToast( "bluetooth not connected" )
                sys.exit( 0 )
            return True
        print "skip bt setup"
        return False

    def main():
        prevchar=' '
        nextchar=' '
        # d.startSensingTimed( 2, 300 )   ### sense start

        fBT = startbluetooth()
        d.ttsSpeak("command me baby")
        #d.ttsSpeak("forward backward left right stop cloud quit")
       
        while True:
            command = d.recognizeSpeech("Command me baby", None, None)
            print command[1]
           
            if command[1] == "stop":
                nextchar = 'x'
                fBT and d.bluetoothWrite('x')
                d.ttsSpeak("stop")
            elif command[1] == "left":
                nextchar = 'a'
                fBT and d.bluetoothWrite('a')
                d.ttsSpeak("left")
            elif command[1] == "right":
                nextchar = 'd'
                fBT and d.bluetoothWrite('d')
                d.ttsSpeak("right")
            elif command[1] == "forward":
                nextchar = 'w'
                fBT and d.bluetoothWrite('w')   
                d.ttsSpeak("forward")
            elif command[1] == "back":
                nextchar = 's'
                fBT and d.bluetoothWrite('s')
                d.ttsSpeak("back")
            elif command[1]=="cloud":
                nextchar='c'
                # grab command by myrobot.com
                conn = httplib.HTTPConnection("bots.myrobots.com")
                conn.request("GET", "/channels/595/feed/last.json")
                response = conn.getresponse()
                print response.status, response.reason
                json_string = response.read()
                print json_string
                conn.close()
                data = json.loads(json_string)
                print data
                #say
                say=data['field2']
                if say:
                    d.ttsSpeak(say)
                #move
                move=data['field1']
                if move:
                    print move
                    #d.bluethoothWrite('w')
                    fBT and d.bluetoothWrite(move)
                    time.sleep(1)
                    fBT and d.bluetoothWrite('x')
                #play
                play=data['field3']
                if play:
                    print play
                    d.mediaPlay(play)
                    fBT and d.bluetoothWrite('a')
                    time.sleep(.5)
                    fBT and d.bluetoothWrite('d')
                    time.sleep(.5)
                    fBT and d.bluetoothWrite('w')
                    time.sleep(.5)
                    fBT and d.bluetoothWrite('s')
                    time.sleep(.5)
                    fBT and d.bluetoothWrite('x')
               
            elif command[1] == "quit":
                fBT and d.bluetoothWrite('x')
                d.ttsSpeak("Did I do something wrong")
                break   # fall through to dismiss the dialog below

            #time.sleep( 0.5 )
            prevchar = nextchar

        d.dialogDismiss()

    if __name__ == "__main__":
        main()

    ————
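The ‘cloud’ branch in the script above expects a ThingSpeak-style last.json payload from bots.myrobots.com, with field1 carrying a drive character, field2 text to speak, and field3 a media URL. A minimal sketch of that dispatch step, using an illustrative made-up payload in place of a live request:

```python
import json

# Illustrative payload in the shape the SL4A script reads from
# /channels/<id>/feed/last.json; a real one comes from bots.myrobots.com.
SAMPLE = '{"field1": "w", "field2": "hello", "field3": null}'

def dispatch(json_string):
    """Return the (move, say, play) actions encoded in a feed entry."""
    data = json.loads(json_string)
    move = data.get('field1')  # single drive character: w/s/a/d/x
    say = data.get('field2')   # text passed to ttsSpeak
    play = data.get('field3')  # URL passed to mediaPlay
    return move, say, play

print(dispatch(SAMPLE))
```

On the phone, each non-empty field triggers the corresponding SL4A call, exactly as in the ‘cloud’ branch above.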

     

     

    Arduino – Rover code

    int E1 = 6; //M1 Speed Control
    int E2 = 5; //M2 Speed Control
    int M1 = 8; //M1 Direction Control
    int M2 = 7; //M2 Direction Control
    void setup(void) {
      int i;
      for(i=5;i<=8;i++) pinMode(i, OUTPUT);
      Serial.begin(9600);
    }
    void loop(void) {
      while (Serial.available() < 1) {
      } // Wait until a character is received
      char val = Serial.read();
      int leftspeed = 255; //255 is maximum speed
      int rightspeed = 255;
      switch(val) // Perform an action depending on the command
      {
      case 'w'://Move Forward
        forward (leftspeed,rightspeed);
        break;
      case 's'://Move Backwards
        reverse (leftspeed,rightspeed);
        break;
      case 'a'://Turn Left
        left (leftspeed,rightspeed);
        break;
      case 'd'://Turn Right
        right (leftspeed,rightspeed);
        break;
      case 'x'://stop
        stop ();
        break;
      default:
        stop();
        break;
      }
    }
    void stop(void) //Stop
    {
      digitalWrite(E1,LOW);
      digitalWrite(E2,LOW);
    }
    // Speed arguments are int (0-255); char would overflow here, since
    // char is signed on AVR and cannot hold 255.
    void forward(int a, int b) {
      analogWrite (E1,a);
      digitalWrite(M1,LOW);
      analogWrite (E2,b);
      digitalWrite(M2,LOW);
    }
    void reverse (int a, int b) {
      analogWrite (E1,a);
      digitalWrite(M1,HIGH);
      analogWrite (E2,b);
      digitalWrite(M2,HIGH);
    }
    void left (int a, int b) {
      analogWrite (E1,a);
      digitalWrite(M1,HIGH);
      analogWrite (E2,b);
      digitalWrite(M2,LOW);
    }
    void right (int a, int b) {
      analogWrite (E1,a);
      digitalWrite(M1,LOW);
      analogWrite (E2,b);
      digitalWrite(M2,HIGH);
    }
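The rover’s wire protocol is just single characters (w/s/a/d/x), as the switch statement above shows. For driving it from a laptop rather than the phone, a sketch like the following could work; the pyserial usage and port name in the comments are assumptions, not part of the original hack, so only the command mapping runs here:

```python
# Maps command words to the single-character protocol the Arduino
# sketch switches on.
COMMANDS = {
    'forward': 'w',
    'back':    's',
    'left':    'a',
    'right':   'd',
    'stop':    'x',
}

def to_protocol(word):
    """Translate a command word to its wire character; unknown words
    map to 'x', matching the Arduino default case (stop)."""
    return COMMANDS.get(word, 'x')

if __name__ == '__main__':
    # import serial                               # pip install pyserial (assumed)
    # port = serial.Serial('/dev/rfcomm0', 9600)  # assumed Bluetooth serial port
    for word in ['forward', 'left', 'stop', 'gibberish']:
        print(word, '->', to_protocol(word))
        # port.write(to_protocol(word).encode())
```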

    ------------

     

    Python script for uploading commands to myrobots.com

    import httplib, urllib
    import time
     
    def doit():
        params = urllib.urlencode({'field1': 'w', 'field2': 'not quite a square','field3': 'http://www.youtube.com/watch?v=WKxx5QC0ewc#t=57s','key':'B85D18A801134D7F'})
        headers = {"Content-type": "application/x-www-form-urlencoded","Accept": "text/plain"}
        conn = httplib.HTTPConnection("bots.myrobots.com")
        conn.request("POST", "/update", params, headers)
        response = conn.getresponse()
        print response.status, response.reason
        data = response.read()
        conn.close()
     
    # note: the API accepts at most one update per 15 seconds, so sleep
    # at least 16 seconds between calls when updating repeatedly
    if __name__ == "__main__":
        doit()
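The update above is just a form-encoded POST to /update. The part worth checking without the network is the encoding itself; here is a sketch of that step (field names follow the script above; the placeholder key and the helper name are my own), with the rate limit baked in as a constant:

```python
from urllib.parse import urlencode, parse_qs

MIN_UPDATE_INTERVAL = 16  # seconds; the API accepts at most one update per 15 s

def encode_update(move=None, say=None, play=None, key='YOUR-WRITE-KEY'):
    """Build the form body for a POST to bots.myrobots.com/update.

    field1 = drive character, field2 = text to speak, field3 = media URL,
    key = the channel's write API key (placeholder here).
    """
    fields = {'field1': move, 'field2': say, 'field3': play, 'key': key}
    # drop unused fields rather than sending empty values
    return urlencode({k: v for k, v in fields.items() if v is not None})

body = encode_update(move='w', say='not quite a square')
print(body)
```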


     

     

     

     

Still think 3d printing is only for geeks? Here’s a great primer

Way too many people still think 3d printing is an anomaly, a waste of time, a hobby. I would tender the view that this is the response of dinosaurs viewing the mammal, or of “serious computing” specialists considering the early microcomputer enthusiasts.

In any case, it was a pleasure to come across this TED talk by Lisa Harouni that gives a great update on all the different elements of 3d printing as it stands today, and points to the rapidly emerging personal manufacturing future.

The 6 killer apps of prosperity – financial crises as epiphenomena – TED talk

This TED talk provides a meta-pattern for understanding the rise of China, India, etc. as the East accumulating some of the ‘killer apps’, or patterns, from the West, and thus regaining its historical productivity balance with the West. It’s the great re-convergence.

At the end of the talk Niall Ferguson makes the case that the current Western fiscal crises are mainly epiphenomena, accelerated by the underlying shift or rebalancing of power.

The universal bot – with SmartPhone


The Xybot turns an ordinary iPhone into a mobile avatar. A phone docked into the robot streams video of the person controlling it using an app running on another iPhone.

Xybotyx of Littleton, Colorado, which makes the robot, was founded by two engineers who met while working on NASA’s Phoenix Mars rover. The Xybot will be released in March at a price of $111.11.

Helluva lot cheaper than a presence bot. Another example of the ubiquity of the SmartPhone making computing / networking / etc effectively free.

Cartels Are an Emergent Phenomenon, Say Complexity Theorists – Technology Review


… in a population of a million agents over a time period of a billion iterations and more.

… It turns out that a crucial factor is the speed at which buyers and sellers react to the market. When buyers react quickest, sellers are forced to match the best possible value for money and prices tend to drop.

By contrast, when sellers react quickest, they are quick to copy others offering poor value for money. This reduces the number of sellers offering good value for money in a vicious cycle that drives prices as high as possible.

This is the emergence of a cartel, and it happens in these guys’ model without any collusion between sellers. Instead, it is an emergent property of the marketplace that happens when the sellers outperform buyers in the way they react to market conditions.

Cool result from computational economics. I wonder if similar things occur throughout the dynamics of new product adoption.
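The paper’s model runs a million agents for a billion iterations; as a back-of-the-envelope caricature (mine, not the researchers’ model), the asymmetry can be sketched with a handful of prices, each drifting toward the highest price at the sellers’ reaction rate (copying the apparent winner) and toward the lowest at the buyers’ reaction rate (value-seeking competition):

```python
def simulate(prices, seller_rate, buyer_rate, steps=50):
    """Toy caricature of the cartel result: each price drifts toward the
    current maximum at seller_rate and toward the minimum at buyer_rate.
    Returns the mean price after the given number of steps."""
    prices = list(prices)
    for _ in range(steps):
        hi, lo = max(prices), min(prices)
        prices = [p + seller_rate * (hi - p) + buyer_rate * (lo - p)
                  for p in prices]
    return sum(prices) / len(prices)

start = [1.0, 2.0, 3.0]
fast_sellers = simulate(start, seller_rate=0.5, buyer_rate=0.1)
fast_buyers  = simulate(start, seller_rate=0.1, buyer_rate=0.5)
print(fast_sellers, fast_buyers)  # mean price rises when sellers react faster
```

In this toy, fast-reacting sellers drag the mean price up and fast-reacting buyers drag it down, echoing the qualitative result quoted above; it says nothing about the agent-based dynamics of the actual study.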

Insights into the future of Computer Interaction and today’s blithe assumptions

Take out your favorite Magical And Revolutionary Technology Device. Use it for a bit.

What did you feel? Did it feel glassy? Did it have no connection whatsoever with the task you were performing?

I call this technology Pictures Under Glass. Pictures Under Glass sacrifice all the tactile richness of working with our hands, offering instead a hokey visual facade.

Is that so bad, to dump the tactile for the visual? Try this: close your eyes and tie your shoelaces. No problem at all, right? Now, how well do you think you could tie your shoes if your arm was asleep? Or even if your fingers were numb? When working with our hands, touch does the driving, and vision helps out from the back seat.

Pictures Under Glass is an interaction paradigm of permanent numbness. It’s a Novocaine drip to the wrist. It denies our hands what they do best. And yet, it’s the star player in every Vision Of The Future.

To me, claiming that Pictures Under Glass is the future of interaction is like claiming that black-and-white is the future of photography. It’s obviously a transitional technology. And the sooner we transition, the better.

HANDS MANIPULATE THINGS

Excellent points on the limitations of our current magical devices. I highly recommend reading the original rant from Bret Victor: http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesign/

The reality is that no more embodied approach for devices yet exists, with the possible exception of homebrew robotics (Arduino and the like). However, these certainly don’t offer anything like the level of ‘magic’ interaction we’re enjoying today with ‘pictures under glass’.

I’m not holding my breath for new approaches any time soon, but longer term …

BuildAR – making Augmented Reality curation easy …


What will you build?

Augmented Reality (AR) overlays information, images, 3D objects, audio and video onto your view of the real world around you.

Create your own mobile AR projects easily with no development required & link your content to the real world!

Steps to making a technology ubiquitous and accessible – grow to simplicity & hide the underlying complexity. BuildAR gets it.

Looking forward to seeing more from @buildAR