
Wednesday, 8 July 2020

Researchers use machine learning to teach robots how to trek through unknown terrains

A team of Australian researchers has designed a reliable strategy for testing the physical abilities of humanoid robots, that is, robots that resemble the human body in build and design. Using a blend of machine learning methods and algorithms, the research team succeeded in enabling test robots to react effectively to unknown changes in a simulated environment, improving their odds of functioning in the real world.

The findings, published in July in the IEEE/CAA Journal of Automatica Sinica, a joint publication of the IEEE and the Chinese Association of Automation, have promising implications for the broad use of humanoid robots in fields such as healthcare, education, disaster response and entertainment.

"Humanoid robots have the ability to move around in many ways and thereby imitate human motions to complete complex tasks. In order to be able to do that, their stability is essential, especially under dynamic and unpredictable conditions," said corresponding author Dacheng Tao, Professor and ARC Laureate Fellow in the School of Computer Science and the Faculty of Engineering at the University of Sydney.

"We have designed a method that reliably teaches humanoid robots to be able to perform these tasks," added Tao, who is also the Inaugural Director of the UBTECH Sydney Artificial Intelligence Centre.

Humanoid robots resemble humans in their physical attributes (a head, a torso, two arms and two legs) and can communicate with humans and other robots. Equipped with sensors and other input devices, they can also perform a limited range of activities in response to outside input.

They are typically pre-programmed for specific activities and rely on two kinds of learning method: model-based and model-free. The former gives a robot a set of models it can use to behave in a scenario, while the latter does not. While both learning methods have been successful to a certain extent, neither paradigm alone has proven sufficient to equip a humanoid robot for real-world scenarios, where the environment changes constantly and often unpredictably.

To overcome this, Tao and his team introduced a new learning structure that incorporates parts of both model-based and model-free learning to balance a biped, or two-legged, robot. The proposed control method bridges the gap between the two paradigms, smoothing the transition from learning the model to learning the actual procedure. Simulation results show that the algorithm can stabilise the robot on a moving platform under unknown rotations. These results demonstrate that the robots can adapt to different unpredictable situations and could therefore be deployed outside the laboratory environment.

In the future, the researchers hope to validate their method in more complex environments, with more unpredictable and changing variables of varying dimensions, as they test the robots' ability to exert full-body control.

"Our ultimate goal will be to see how our method enables the robot to have control over its entire body as it is exposed to unmeasurable and unexpected disturbances such as a changing terrain. We would also like to see the robot's ability to learn how to imitate human motion, such as ankle joint movement, without having been given prior information." Source: https://www.domain-b.com/
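The hybrid idea, combining a model-based control law with a model-free correction learned online, can be illustrated with a toy one-dimensional balance problem. Everything below is invented for the sketch and is not the paper's actual method: a PD law plays the model-based part, and a slowly adapted bias term, loosely standing in for the model-free component, absorbs an unmodelled platform disturbance.

```python
import math

def simulate(steps=1000, dt=0.02, use_residual=True):
    """Toy balance task: keep a lean angle near zero while an unknown,
    constant platform-tilt torque disturbs the dynamics. A PD law stands
    in for the model-based part; a slowly adapted bias term stands in
    for the model-free residual that absorbs what the model misses."""
    theta, omega = 0.3, 0.0        # initial lean (rad) and angular rate
    residual = 0.0                 # model-free correction, learned online
    disturbance = 0.15             # unknown platform-tilt torque
    kp, kd, lr = 30.0, 6.0, 5.0    # PD gains and residual learning rate
    for _ in range(steps):
        u_model = -kp * theta - kd * omega          # model-based action
        u = u_model + (residual if use_residual else 0.0)
        # pendulum-like dynamics plus the unmodelled disturbance
        alpha = 9.81 * math.sin(theta) + disturbance + u
        omega += alpha * dt
        theta += omega * dt
        if use_residual:
            residual -= lr * theta * dt             # integrate away bias
    return theta

hybrid = simulate(use_residual=True)    # settles close to upright
pd_only = simulate(use_residual=False)  # left with a steady-state lean
```

Run both ways, the PD-only controller is left leaning against the disturbance, while the hybrid version drives the lean essentially to zero, which is the qualitative benefit the researchers report for combining the two paradigms.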

Wednesday, 7 March 2018

Burger-flipping robot gets $10M 'bonus'

Miso Robotics, creator of the burger-flipping robot that will soon debut at 50 Caliburger locations, has raised $10 million in a Series B funding round led by Acacia Research Corp. The hefty infusion of cash brings the funding total to $13.1 million, money that Miso Robotics will use to bring "Flippy" to Caliburger, according to a press release.

Notably, the latest investment round included funding from Levy Restaurants, a hospitality company that runs restaurants in high-traffic venues that range from Barclays Center to Walt Disney World Resort. 

Some industry-watchers speculate that robotic technology might end up replacing restaurant staff, but Miso Robotics stressed in the press release that Flippy's purpose is simply to take some of the load off kitchen staff.

"People really like the idea of a kitchen assistant that can really come in and be that third hand for the overworked staff," Miso Robotics CEO Dave Zito said. "They're all reporting high turnover rate and increasing customer demand for fresh ingredients prepared quickly. Trying to keep that at accessible prices is hard."

Flippy features intelligent cooking AI software and provides a camera view of the flat top. Miso Robotics said that Flippy's current restaurant application could have limits in scalability due to finite grill and restaurant seating space. But the company said the robot could assist kitchen staff with to-go orders in busier locations. 

With its latest round of funding, Miso joins a number of well-funded robotics-based food ventures, including Momentum Machines. Backed by $18.4 million in funding, Momentum developed a robot in 2012 that was capable of grilling, assembling and wrapping as many as 400 burgers in an hour. The company has not yet rolled out its robots to the public, however.

The culinary robot space is fast becoming populated with offerings such as Cafe X, a robotic coffee shop that serves coffee from specialty roasters. That company plans to open its third location soon at an undisclosed site, the news release said. Source: Burger-flipping robot gets $10M 'bonus'

Sunday, 10 May 2015

Telepresence robot linked up with Oculus Rift

A team of scientists at the University of Pennsylvania in the US has developed a telepresence robot that uses Oculus Rift technology to create a hyper-realistic experience for users. Called DORA (Dexterous Observational Roving Automaton), it is designed to give users the feeling that they are in the room with the robot. It works by tracking the movements of the user's head and relaying them to the robot, which is fitted with Arduino and Intel Edison microcontrollers.

One of DORA's creators, Emre Tanirgan, describes the experience of using the system: "You feel like you are transported somewhere else in the real world as opposed to a simulated environment. You actually get to see people interacting with you, as if you were actually there, and it's difficult to recreate the same experience in a virtual environment right now." Now that the system is in place and working, the developers are refining it to reduce the lag, which is currently 70 milliseconds; they are aiming for less than 60 milliseconds. As for potential clients, the scientists think the most natural fits for the system will be the emergency services and museums wanting to offer more immersive virtual tours. Source: InAVate
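The head-tracking link can be pictured as a stream of small orientation packets sent from the headset to the robot's microcontrollers. The wire format below is invented for illustration (DORA's actual protocol has not been published): a millisecond timestamp plus yaw, pitch and roll, with the timestamp letting the receiver measure the lag the team is trying to drive below 60 milliseconds.

```python
import struct

# Hypothetical 16-byte packet: uint32 millisecond timestamp followed by
# yaw, pitch and roll as little-endian 32-bit floats.
POSE = struct.Struct("<Ifff")

def pack_pose(t_ms, yaw, pitch, roll):
    return POSE.pack(t_ms, yaw, pitch, roll)

def unpack_pose(data):
    return POSE.unpack(data)

def lag_ms(sent_ms, received_ms):
    """One-way latency estimate; the team's target is under 60 ms."""
    return received_ms - sent_ms

pkt = pack_pose(1000, 0.10, -0.05, 0.0)
t_ms, yaw, pitch, roll = unpack_pose(pkt)
```

A fixed binary layout like this keeps per-packet overhead tiny, which matters when every extra millisecond of processing adds directly to the lag the user perceives.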

Sunday, 19 April 2015

World's first robotic kitchen to debut in 2017

The world's first automated kitchen system was unveiled this week at Hannover Messe in Germany, the premier industrial robotics show. Developed by tech firm Moley Robotics, it features a dexterous robot integrated into a kitchen that cooks with the skill and flair of a master chef. The company's goal is to produce a consumer version within two years, supported by an iTunes-style library of recipes that can be downloaded and created by the kitchen.

The prototype at the exhibition is the result of two years' development and the collaboration of an international team, including Sebastian Conran, who designed the cooking utensils, and Mauro Izzo, DYSEGNO and the Yachtline company, who created the futuristic kitchen furniture. Two complex, fully articulated hands, made by the Shadow Robot Company, comprise the kitchen's key enabling technology. The product of 18 years' research and development, Shadow's products are used in the nuclear industry and by NASA. Able to reproduce the movements of a human hand with astonishing accuracy, they underpin the unique capability of the automated kitchen.

The Moley Robotics system works by capturing human skills in motion. Tim Anderson, culinary innovator and winner of the BBC MasterChef competition, played an integral role in the kitchen's development. He first developed a dish that would test the system's capabilities, a crab bisque, and was then 3D-recorded cooking it at a special studio. Every motion and nuance was captured, from the way Tim stirred the liquids to the way he controlled the temperature of the hob. His actions were then translated into elegant digital movement using bespoke algorithms. The robot doesn't just cook like Tim: in terms of skill, technique and execution, it is Tim producing the dish. The kitchen even 'signs off' its work with an 'OK' gesture, just as the chef does.

"To be honest, I didn't think this was possible," he said. "I chose crab bisque as a dish because it's a real challenge for a human chef to make well – never mind a machine. Having seen – and tasted – the results for myself, I am stunned. This is the beginning of something really significant: a whole new opportunity for producing good food and for people to explore the world's cuisines. It's very exciting."

Moley Robotics, headquartered in the UK, is now working to scale the technology ready for mass production and installation in regular-sized kitchens. Future iterations will be more compact, with smaller control arms but with added functionality in the form of a built-in refrigerator and dishwasher to complement a professional-grade hob and oven. The company is working with designers, homebuilders, kitchen installers and food suppliers to promote the system. The mass-market product will be supported by a digital library of over 2,000 dishes when it launches in 2017, and it is envisaged that celebrity chefs will embrace 3D cooking downloads as an appealing addition to the cookbook market. Home chefs will be able to upload their favourite recipes too, and so help create the 'iTunes' for food.

Moley Robotics was founded by London-based computer scientist, robotics and healthcare innovator Mark Oleynik. The company's aim is to produce technologies that address basic human needs and improve day-to-day quality of life. "Whether you love food and want to explore different cuisines, or fancy saving a favourite family recipe for everyone to enjoy for years to come, the Automated Kitchen can do this," says Oleynik. "It is not just a labour-saving device – it is a platform for our creativity. It can even teach us how to become better cooks!"

The robotic hands demonstrated this week offer a glimpse of the not-too-distant future, when even greater advances in movement, flexibility, touch and object recognition will have been achieved. Experts believe that near-perfect recreations of human hands, operating in a wide variety of environments, will be possible in just 10 years' time.
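The capture-and-playback pipeline described above can be sketched in miniature: record a demonstration as timestamped joint positions, then interpolate between samples at playback time. The two-joint trajectory and function names here are illustrative assumptions, not Moley's actual data format or algorithms.

```python
from bisect import bisect_left

def record(demo):
    """demo: iterable of (t_seconds, joint_angle_tuple) samples from a
    motion-capture session; returned sorted by time."""
    return sorted(demo)

def replay(traj, t):
    """Linearly interpolate the recorded trajectory at time t, clamping
    to the first/last pose outside the recorded interval."""
    times = [p[0] for p in traj]
    i = bisect_left(times, t)
    if i == 0:
        return traj[0][1]
    if i == len(traj):
        return traj[-1][1]
    t0, q0 = traj[i - 1]
    t1, q1 = traj[i]
    w = (t - t0) / (t1 - t0)
    return tuple(a + w * (b - a) for a, b in zip(q0, q1))

traj = record([(0.0, (0.0, 0.0)), (1.0, (1.0, 0.5)), (2.0, (0.0, 1.0))])
mid = replay(traj, 0.5)   # pose halfway between the first two samples
```

A real system would replay far denser samples through inverse-kinematics and force control, but the core idea, a time-indexed trajectory captured once and replayed on demand, is what makes a downloadable "recipe" possible.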

Wednesday, 23 July 2014

Get ready for Singularity: it’s closer to reality than we think

By Edie Lush. I have spent a lot of the past week checking out Ray Kurzweil's world after reading Carole Cadwalladr's interview with him in The Observer. It is a pretty eye-opening experience. Google's new director of engineering estimates that computers will gain 'consciousness' by 2029 - i.e. when machines have learned to make their own decisions.

Kurzweil is one of the poster boys of the Singularity, defined by Wikipedia as the "moment in time when artificial intelligence will have progressed to the point of a greater-than-human intelligence". Associated with this are the concepts of Human 2.0, or transhumanism, in which humans begin to augment or replace parts of themselves with robots or computers. Kurzweil has a great deal to say about a future in which we have reverse-engineered the brain - i.e. figured out what every bit of our grey matter does, how and why. According to Kurzweil, "By 2030, reverse-engineering of the human brain will have been completed and non-biological intelligence will merge with our biological brains." (He wrote this in 2003, by the way.)

What does this even mean? Well, it means we'll have connected our brains to computers. They'll be able to monitor everything we hear, see and think, plus everything in our email inbox, and answer our questions before we've thought of them. Asked recently 'why would we do this?', Kurzweil responded: "Our search engines will… watch everything we're reading and writing and saying and hearing, and then they'll be like an assistant. It'll say... 'You were wondering who the actor was in that movie with the robot that can speak six million languages and here she is and here's background about her.'

"Since that helps you through the day, we'll answer your questions before you ask them or even before you realise you have a question, and you'll just get used to this information popping up that you wanted and you'll be frustrated if you're thinking about something and it doesn't immediately pop up without you even having to ask for it."

Does that sound far-fetched? In fact, we're already connecting our minds and bodies to computers. I recently met Olivier Oullier, Professor of Behavioral and Brain Science at Aix-Marseille University, who uses wearable technology and neurofeedback to train people out of cigarette addiction and obesity. With the help of some special glasses which monitor where you are looking, a brain monitor that senses and records which parts of your brain are being activated, and your mobile phone, he can train your brain to better control food or cigarette cravings in a video-game-like setting.

Outlier? No. Scientists in Malta have developed software that allows you to control your music (play, fast-forward, turn it up to 11) with your brain. Researchers in Washington connected two people's brains together, allowing one person to control another's hand movements by thought. The FDA in the US has granted pre-market approval to NeuroPace, a company whose brain implant reduces seizures in epileptic patients by identifying dangerous patterns of brain activity. Meanwhile, BrainGate has implanted sensors that allowed a woman with tetraplegia to use her thoughts to steer a robot arm to grasp a bottle of coffee and lift it to her lips. The scientists at Proteus combine wearable and ingestible sensors to gather information about medication-taking, activity and rest patterns. Can't remember if you took your last dose of heart medication? Sensors in the pills will send the information to your phone, alerting you so you can rest easy.

Most of us with an iPhone have experienced Siri's (in)ability to translate our voice into simple commands. In my experience it's better at calling my mother than finding directions to a restaurant, but while there are numerous, humorous examples of #Sirifails, you can bet each version will be better. And with life-logging devices like Narrative, which capture your every move in pictures (a "searchable and shareable photographic memory"), you can see how we might not be too far away from Kurzweil's vision of predictive search. #Scaredyet? Edie Lush tweets at @edielush. Source: The Week UK

Saturday, 2 March 2013

First Look: Antarctic Subglacial Lake Explored By Minisub

Image credit: NASA/JPL-Caltech
A video camera on a NASA-designed-and-funded mini-submarine captured this view as it descended a 2,600-foot-deep (800-meter-deep) borehole to explore Antarctica's subglacial Lake Whillans. The international Whillans Ice Stream Subglacial Access Research Drilling (WISSARD) project was designed to gain insights into subglacial biology, climate history and modern ice sheet behavior.

When researcher Alberto Behar from NASA's Jet Propulsion Laboratory in Pasadena, Calif., joined an international Antarctic expedition last month on a trek to investigate a subglacial lake, he brought with him a unique instrument designed and funded by NASA to help the researchers study one of the last unexplored aquatic environments on Earth.

First view of the bottom of Antarctic subglacial Lake Whillans, captured by the high-resolution imaging system aboard the Micro-Submersible Lake Exploration Device. The imagery and other data from the mini-sub were used to survey the lake floor and help the WISSARD team verify that the rest of their instruments could be safely deployed into the lake.
Image credit: NASA/JPL-Caltech
Called the Micro-Submersible Lake Exploration Device, the instrument was a small robotic sub about the size and shape of a baseball bat. Designed to expand the range of extreme environments accessible by humans while minimally disturbing the environment, the sub was equipped with hydrological chemical sensors and a high-resolution imaging system. The instruments and cameras characterize the geology, hydrology and chemical properties of the sub's surroundings. Behar supervised a team of students from Arizona State University, Tempe, in designing, developing, testing and operating the first-of-its-kind sub.

"This is the first instrument ever to explore a subglacial lake outside of a borehole," Behar said. "It's able to take us places that are inaccessible by any other instruments in existence."

The sub was deployed by the U.S. team of the international Whillans Ice Stream Subglacial Access Research Drilling (WISSARD) project. The project's objective was to access subglacial Lake Whillans, located more than 2,000 feet (610 meters) below sea level, deep within West Antarctica's Ross Ice Shelf, nearly 700 miles (about 1,125 kilometers) from the U.S. McMurdo Station. The 20-square-mile (50-square-kilometer) lake is totally devoid of sunlight and has a temperature of 31 degrees Fahrenheit (minus 0.5 degrees Celsius). It is part of a vast Antarctic subglacial aquatic system that covers an area about the size of the continental United States.

The WISSARD team included researchers from eight U.S. universities and two collaborating international institutions. They used specialized tools to get clean samples of subglacial lake water and sediments, survey the lake floor with video and characterize the biological, chemical and physical properties of the lake and its surroundings. Their research is designed to gain insights into subglacial biology, climate history and modern ice sheet behavior.

The instrument consists of a "mothership" connected to a deployment device that houses the submarine. The sub is designed to operate at depths of up to three-quarters of a mile (1.2 kilometers) and within a range of 0.6 miles (1 kilometer) from the bottom of the borehole that was drilled through the ice to reach the lake. It transmits real-time high-resolution imagery, salinity, temperature and depth measurements to the surface via fiber-optic cables.

In a race against time and the elements to access the lake before the end of the current Antarctic field season, the WISSARD team spent three days in January drilling a 2,600-foot-deep (800-meter), 20-inch-wide (50-centimeter) borehole into the lake, which they reached on Jan. 28. Like Alice down the rabbit hole, the sub was then sent down the borehole, where it was initially used to guide drilling operations. When the instrument finally reached the lake, the team used its imagery to survey the lake floor. The data enabled the team to verify that the rest of the project's instruments could be safely deployed into the lake.

The WISSARD team was then able to proceed with its next phase: collecting lake water samples to search for microbial life. And that search has apparently paid off. Earlier this month, the team reported that the lake water did indeed contain living bacteria, a discovery that might hold important implications for the search for life elsewhere in the universe.

Core funding for WISSARD and the Micro-Submersible Lake Exploration Device was provided by the National Science Foundation-Office of Polar Programs. The sub was funded by NASA's Cryospheric Sciences and Astrobiology programs. Additional funds for WISSARD instrument development were provided by the National Oceanic and Atmospheric Administration and the Gordon and Betty Moore Foundation.

Contacts and sources: Alan Buis, Jet Propulsion Laboratory. For more on WISSARD, visit: http://www.wissard.org. For more on Behar's previous robotic Antarctic research, visit: http://www.nasa.gov/topics/earth/features/antarctic-shrimp.html. Source: Nano Patents And Innovations
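As a small illustration of the sub's stated operating limits (depths up to three-quarters of a mile, about 1.2 kilometers, and within 0.6 miles, about 1 kilometer, of the borehole exit), mission software might gate any commanded excursion with a check like the one below. The function and constant names are invented for the sketch, not WISSARD's actual software.

```python
# Operating limits as stated in the article; names are illustrative.
MAX_DEPTH_KM = 1.2    # about three-quarters of a mile
MAX_RANGE_KM = 1.0    # about 0.6 miles from the borehole bottom

def within_envelope(depth_km, range_km):
    """True if a proposed position respects both operating limits."""
    return 0.0 <= depth_km <= MAX_DEPTH_KM and 0.0 <= range_km <= MAX_RANGE_KM

ok = within_envelope(0.8, 0.5)        # inside the envelope
too_deep = within_envelope(1.5, 0.5)  # exceeds the depth limit
```

In practice the same bounds would be enforced on telemetry coming up the fiber-optic tether, so operators at the surface could see a violation before the vehicle reached it.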

Thursday, 28 February 2013

Nexus S from Google: Robotic Charm

The Nexus S is a smartphone co-developed by Google and Samsung and manufactured by Samsung Electronics for release in 2010. It was the first smartphone to use the Android 2.3 "Gingerbread" operating system, and the first Android device to support Near Field Communication (NFC) in both hardware and software. This was the third time that Google worked with a manufacturer to produce a phone, the first and second being the G1 and the Nexus One, both by HTC. Following the Nexus S, the next Android developer phone was the Galaxy Nexus, released the following year.

The Nexus S was produced in four variants: the GT-I9020 (Super AMOLED) and GT-I9023 (Super Clear LCD), each aimed at different markets; the SPH-D720, the newer 4G version of the phone available in the US; and the SHW-M200, a variant of the GT-I9020 released specifically for the Korean market. Source: Wikipedia. Image: screenshot from video

Wednesday, 27 February 2013

Smart Electric Vehicle Balances on Two Wheels


San Francisco startup Lit Motors has created the C1, a two-wheel, self-balancing electric vehicle that brings the benefits of a motorcycle together with the safety and comfort of a car, according to founder and CTO Danny Kim. After speaking on stage at GreenBiz's Verge conference just blocks from his Lit Motors lab, he invited Intel Free Press to his warehouse to talk about the technology his team is building into the C1.

To make the C1 affordable, appealing, safe and optimally performing, Kim turned to technologies such as computer-aided design, stabilization mechanisms and embedded computer systems tied to sensors functioning somewhat like those found under the hood of Android and Apple smartphones, says Kim. At the core of the C1 are two 40-kilowatt electric motors, and nestled beneath the driver's seat is a set of heavy, fast-spinning gyroscopes, similar to the positioning and orientation technology used in the International Space Station and many satellites. These gyros put out 1,300 foot-pounds of torque, providing enough balancing power "that it would take a baby elephant to knock it over," said Kim.

In addition to the frame, body and battery-recharging system, there is an intricate nervous system spread throughout the vehicle that collects data and returns instructions processed by two Intel Core i7 desktop computer chips. This is what turns the motorcycle into a robot. "There are servos, gyro and traction motors, inertia and infrared sensors, temperature and heat sensors, really a myriad of sensors that all feed data to be processed," said Kim. "Through that process, a command goes to the gyros to tilt and lean the vehicle to keep it balanced or to lean into a turn -- it's all heavily based on the computer processing system."
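The sensor-to-gyro loop Kim describes can be sketched as two pieces: an inertial estimate of the lean angle (the same complementary-filter trick phone sensor stacks use) and a control law that turns lean error into a gimbal command for the flywheels. The filter constant and gains below are invented for illustration; Lit Motors' actual controller is proprietary.

```python
def fuse_lean(accel_lean, gyro_rate, prev_lean, dt, alpha=0.98):
    """Complementary filter: trust the rate gyro short-term and the
    accelerometer long-term to estimate lean angle (radians)."""
    return alpha * (prev_lean + gyro_rate * dt) + (1 - alpha) * accel_lean

def gimbal_command(lean, lean_rate, kp=800.0, kd=120.0):
    """PD law mapping lean error to a gimbal torque request (N*m).
    Precessing the flywheels against this torque rights the vehicle."""
    return -(kp * lean + kd * lean_rate)

# One loop iteration: estimate lean from raw sensors, then command.
lean = fuse_lean(accel_lean=0.2, gyro_rate=0.0, prev_lean=0.0, dt=0.01)
cmd = gimbal_command(lean, lean_rate=0.0)
```

A rightward lean produces a negative (leftward-righting) command, and the filter's heavy weighting of the integrated gyro rate keeps the estimate smooth between slower accelerometer corrections.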

Saturday, 16 February 2013

From science fiction to science fact: MIT expert achieves invisibility

Scientists have tended to dismiss as impossible the very idea of a device that renders something invisible, but that failed to dissuade one young academic from looking more closely at how it might be achieved. Mr Janos Perczel was a 22-year-old undergraduate at the University of St Andrews in Scotland when, in August 2011, he published a study describing an 'invisible sphere' that slowed down light, potentially allowing the device to remain invisible in front of ever-changing backgrounds of different colours. The Voice of Russia contacted Mr Perczel two years after his revolutionary discovery to ask about his recent projects and his studies at MIT.
Voice of Russia: Janos, before we get into details, how did you come up with the idea of trying to develop an optical device that would allow things to be hidden against changing backgrounds? Was there someone who inspired you?

Janos Perczel: Science is always a matter of collaborating with other people. There were many people involved in the development of the project. One of them was Professor Ulf Leonhardt, who was my supervisor. He was not only my mentor but also my main source of inspiration. There was also Professor Tomas Tyc, who helped us with the project. Professor Leonhardt and Professor Tyc had been trying for some time to come up with a device that would allow one to remain invisible against various backgrounds. After I joined the project, it turned out that the optical sphere the professors were proposing was operational in only one colour of the spectrum. I started looking into other solutions, and it occurred to me that the invisible sphere has to be transmuted in order to slow all the light down and operate in all parts of the colour spectrum. Quite literally, I was having my breakfast in my house when I realised that a transmutation technique would make the optical sphere operational.

Voice of Russia: So what is the 'invisible sphere'? When I think of the device you created, I imagine it as somewhat like the invisibility cloak in the Harry Potter movies. Does it work the same way as science-fiction novelists so often describe?

Janos Perczel: There are substantial differences between our sphere and what you see in the Harry Potter movies, the most important of them being that the invisibility cloak in Rowling's novels is mouldable, while our optical device is a rigid sphere-like object. You cannot change its shape or wrap it around yourself. The flexibility of the cloak that you so often see in the movies is incredibly difficult to achieve in reality.
Admittedly, there are certain proposals for how to do this, especially from St Andrews. Dr Andrea Di Falco has recently come up with the idea of flexible metamaterials, which might eventually lead to the creation of Harry-Potter-like cloaks. For now, however, rigid box- or sphere-like invisibility devices seem to be more realistic and, indeed, are already being produced in experimental science labs. (Editor's note: metamaterials are artificial materials made from large molecules that can be combined to produce exactly the required properties.)

Voice of Russia: Janos, do you see your sphere being used in real life? When you first came up with the idea, did you ever think of its practical applications?

Janos Perczel: This question is always the hardest one to answer. It is hard to tell how a new device will be applied in practice because you never think about it during the research process. When people like myself develop a new device, we tend to think about the particular scientific problem we are trying to solve and not about how our research will be used in the future. We like the feeling of exploring the unexplored. The decision about how the device will be used in practice usually belongs to the engineers, designers and production managers who know the market well and can foresee what will sell better. Obviously, our device can be used to make things invisible when needed. I am worried about the potential military use of our optical sphere, especially in the development of new invisible weapons. My hope is that the invisible sphere will be used in more peaceful ways. One of them might be shielding people from hazardous forms of radiation. Still, these are just suggestions.

Voice of Russia: I know that you finished your invisibility sphere project back in August of 2011. Have you ever thought of continuing with your research?
Janos Perczel: It would be great to do something that would either be a continuation of our project or something relating to metamaterials more generally. Metamaterials can do incredible things. Their power lies in the fact that they can mould the flow of light. One area of research where they can be used is perfect imaging. Professor Leonhardt, for instance, is currently studying how metamaterials can be used to enhance the image resolution of microscopes. It would be great to get involved in such a project.

Voice of Russia: What are you currently working on? Should we wait for yet another ground-breaking project?

Janos Perczel: After publishing the paper in August 2011, I spent the next year at Trinity College, University of Cambridge, where I obtained a Master's degree in Mathematical Physics, focusing mainly on quantum teleportation. Then I moved to the Massachusetts Institute of Technology, where I am currently more involved with learning new physics than with research. For a physicist, it is very important to find the right balance between the two. For now, I am still trying to decide what my next project should be.

Voice of Russia: My last question is about your MIT colleagues. I understand that you have not yet spent much time at the Institute, only six months, but during this time have you come across a project that seemed absolutely extraordinary to you?

Janos Perczel: Having spent only half a year here, it is very hard to get a good overview of the research that is happening on campus. You always hear about new robots being developed, computer technologies being advanced, and basic scientific ideas being interpreted from a new angle. One project that I found especially interesting is Professor Marin Soljacic's research on wireless electricity, which he calls 'witricity'. To transmit electricity from one point to another, Professor Soljacic uses magnetic resonance rather than wires.
To me, this is not merely a remarkable idea that Professor Soljacic pioneered and made work; it is a revolutionary scientific breakthrough that can potentially transform the way all modern electronic devices work. Source: Voice of Russia

Tuesday, 15 January 2013

Robotic endoscopy will be in place in 3 years: expert

Research on performing flexible robotic endoscopic surgery is in its final stages, a senior consultant surgeon at the National University Health System in Singapore has said. Though robotic surgeries on three humans were successful, the technique has to be modified and perfected before the medical fraternity can use it without any hassles, Prof. Ho Khek Yu, who performed the first such surgery, said here last night.

Ho, here to attend the five-day 52nd Annual Conference of the Indian Society of Gastroenterology, said a flexible endoscope with small robotic arms is inserted through the patient's mouth into the stomach. The surgeon monitors the procedure on a computer and controls the arms with joysticks and buttons, he said. However, the robot needs to be modified to perform various other procedures in the digestive tract, Ho said, and he claimed that robotic endoscopy would be the future and would be in place in three years.

V G Mohanprasad, chairman of VGM Hospital and organising secretary of the conference, said treatment costs are going up in India, particularly in speciality fields, one reason being the higher import and excise duties levied by the Government. Mohanprasad also said the climate in India was not conducive for manufacturers to invest huge amounts, as imported equipment would be cheaper at present. Source: Indian Express

Tuesday, 18 December 2012

Risk of Robot Uprising Wiping Out Human Race to be Studied

LONDON, UK – Cambridge researchers are to assess whether technology could end up destroying human civilisation. The Centre for the Study of Existential Risk (CSER) will study dangers posed by biotechnology, artificial life, nanotechnology and climate change. The scientists said that to dismiss concerns of a potential robot uprising would be "dangerous". Fears that machines may take over have been central to the plot of some of the most popular science fiction films. Perhaps most famous is Skynet, a rogue computer system depicted in the Terminator films. Skynet gained self-awareness and fought back after first being developed by the US military. But despite being the subject of far-fetched fantasy, researchers said the concept of machines outsmarting us demanded mature attention. "The seriousness of these risks is difficult to assess, but that in itself seems a cause for concern, given how much is at stake," the researchers wrote on a website set up for the centre. The CSER project has been co-founded by Cambridge philosophy professor Huw Price, cosmology and astrophysics professor Martin Rees and Skype co-founder Jaan Tallinn. "It seems a reasonable prediction that some time in this or the next century intelligence will escape from the constraints of biology," Prof Price told the AFP news agency. "What we're trying to do is to push it forward in the respectable scientific community." He added that as robots and computers become smarter than humans, we could find ourselves at the mercy of "machines that are not malicious, but machines whose interests don't include us". Survival of the human race permitting, the centre will launch next year. Source: Korea Times

Sunday, 2 December 2012

Russian unmanned spacecraft docks on second try

(Reuters) - An upgraded Russian unmanned spacecraft successfully linked up with the International Space Station on Sunday on its second attempt to test a new docking system, Russia's space agency said. The docking set aside doubts over the new Kurs-NA rendezvous system that will deliver astronauts and future cargoes to the orbital station, after a botched first test earlier this week when the equipment malfunctioned due to low temperatures. The operating system functioned properly after it was allowed to warm up, according to a statement from the U.S. space agency NASA. Kurs-NA is an upgrade of the Kurs docking gear used for years on Russia's manned Soyuz and robotic Progress spacecraft. The system consolidates five antennas into one, has updated electronics and is designed to improve safety and use less power, according to NASA. The Progress ship re-docked with the Pirs module at 0100 GMT (9 p.m. EDT on Saturday), the Russian space agency Roscosmos said in a statement, for a brief final stay before the single-use craft, laden with space station trash, is due to burn up on re-entry over the Pacific Ocean on July 30. Source: The Coming Crisis

Friday, 5 October 2012

Real-life "Contagion" uses DNA to halt outbreak


(Reuters) - If Hollywood needs a plot for a medical thriller, scientists at the National Institutes of Health have one: Doctors, using cutting-edge technology called whole-genome sequencing, trace an outbreak of a deadly bacterial infection, identify precisely how it's spreading - and in the final minutes sic poison-spewing robots on the rampaging microbes. That's essentially what scientists did when Klebsiella pneumoniae, an often-lethal bacterium, spread through NIH's research hospital in Bethesda, Maryland, last year, as described in a study published on Wednesday in the journal Science Translational Medicine. "With whole-genome sequencing," said microbial geneticist Julie Segre of NIH's National Human Genome Research Institute, who led the study, "we were able to understand how the outbreak was moving through the hospital and identify weaknesses" in infection-control practices, finally halting the outbreak. Source: The Coming Crisis

Friday, 21 September 2012

Digital superwomen! Graphic artist transforms photos of women using Photoshop in these beautiful illusions

Extreme Photo Manipulations by Michael Oswald
By Alex Ward. Chiseled features: The posed photo of Maria Gruner from California has been manipulated to create this beautiful illusion, right, while Ranie Egusquiza from California had her muscles 'enhanced', left, by the digital artist, whose work is created entirely on a computer using his digital skills and traditional art knowledge
Photoshop can be a great tool to airbrush away a blemish or dark circles in photos, but in these images it has been used to add tree branches for arms and steel panels for a body. Digital artist Michael Oswald describes his art as ‘photo manipulation on steroids’, and quite rightly, as he transforms posed photographs into beautiful illusions. The amazing before-and-after images include a bikini-clad woman posing with a chisel and hammer, and the result: a figure chiselling herself out of stone.
'Just tools': Mr Oswald said 'paintbrushes and computers are just tools' in his artworks including an image featuring model Anastasia King
Mr Oswald, 30, who also calls himself Michael O, said on his website: ‘With the exception of the original digital photograph, my work is created entirely on a computer utilising my knowledge of digital techniques and the traditional art skills I learned in my younger days. I consider the "concept" to be the best part of my work so I put a lot of thought into it and I try not to hold back from expressing myself.’ When it comes to using the computer rather than paint and brushes, Mr Oswald said in an interview with Empty Kingdom: ‘I don’t really see a difference in the requirements for a traditional or digital artist.
 
Robotic: Natasha Lazareva from New York City is transformed into a robotic woman complete with a plug for an electric cord
‘I believe that 50 per cent of skill in art is a natural-born gift. The other 50 per cent is developed with practice.
‘Paintbrushes and computers are just tools. The standard rules of art, like composition, always apply, and the medium is just a personal choice, so everything I learned in basic art classes still applies today.’
Skill and practice: Mr Oswald does not see a difference in requirements between a traditional and a digital artist because both come down to natural-born skill and practice, he said
Set in stone: Using the reference photo, bottom left, Mr Oswald created this digital artwork from a photo of model Sara Duncan from California. Source: Travelfwd+

Saturday, 18 August 2012

Now, a ‘robo-dog’ that guides the blind

Japanese inventors have created a ‘robo-dog’ that can make life a lot easier for blind people. The robotic dog, designed by Japanese developer NSK along with the University of Electro-Communications, uses a Microsoft Kinect image and distance sensor to create a 3-D visualisation of obstacles ahead, the Daily Mail reported. On flat surfaces, the robot can speed up by rolling along on wheels, but it navigates uneven ground and stairs with its hinged legs. The ‘paws’ contain bumper sensors to help it avoid obstacles. Source: Indian Express

Monday, 6 August 2012

Nasa's Curiosity rover survives 13,000mph plunge onto Martian surface

By MARK PRIGG: After travelling eight-and-a-half months and 352 million miles, Nasa's rover Curiosity finally landed on Mars at 5.33 GMT (1.33 EDT) this morning. The high-tech craft hit the top of the Martian atmosphere at 13,000mph, and was then slowly lowered by a radical floating 'sky crane' before gently arriving in a massive crater. The news was greeted with cheers and shouts in Nasa's Pasadena Mission Control, and within seconds the craft had sent back the first pictures of its new home.
Jubilant scientists hugged, wept and distributed Mars bars to each other as mission controllers confirmed the landing. 'Touchdown confirmed', controllers said. 'We are wheels down on Mars. Oh, my God.' Nasa Administrator Charles Bolden hailed the success as a big step towards sending men to the red planet. 'Today, the wheels of Curiosity have begun to blaze the trail for human footprints on Mars,' he said.
The mission was hailed by President Obama, who said: 'Tonight, on the planet Mars, the United States of America made history.' The trickiest moment of the landing came in a truly out-of-this-world gymnastics routine during Curiosity's 'seven minutes of terror' plummet through the atmosphere. When the rover had safely navigated its landing and touched down on the face of the Red Planet, NASA scientists exploded with delight and some even broke down in tears, overwhelmed at the success of the decades-long project. Nasa was ready for the 'Super Bowl of planetary exploration,' said Doug McCuistion, head of the Mars exploration program at NASA headquarters. 'We score and win or we don't score and we don't win,' he said. Curiosity's trajectory was so accurate that engineers decided to wave off a last chance to tweak its position before atmosphere entry. 'We're ready to head in,' said mission manager Brian Portock.
Not ones to tempt fate, flight controllers broke out the 'good luck' peanuts before Curiosity took the plunge, as part of a long-running tradition. One scientist who could relate to the building anxiety was Cornell University planetary scientist Steve Squyres, who headed NASA's last successful rover mission in 2004. This time around, Squyres had a supporting role and planned to view the landing with other researchers in the 'science bullpen.' 'Landing on Mars is always a nerve-racking thing. You're never going to get relaxed about something like landing a spacecraft on Mars,' he said. Sunday's touchdown attempt was especially intense because NASA was testing a brand new landing technique. Due to the communication delay between Mars and Earth, Curiosity was on autopilot. There was also extra pressure because budget woes have forced NASA to rejigger its Mars exploration roadmap. 'There's nothing in the pipeline' beyond the planned launch of a Mars orbiter in 2013, said former NASA Mars czar Scott Hubbard, who teaches at Stanford University. Curiosity was launched to study whether the Martian environment ever had conditions suitable for microbial life. The last Mars rovers, twins Spirit and Opportunity, were cocooned in air bags and bounced to a stop in 2004. The plans for Curiosity called for a series of braking tricks, similar to those used by the space shuttle, and a supersonic parachute to slow it down. Next: ditch the heat shield used for the fiery descent. And in a new twist, engineers came up with a way to lower the rover by cable from a hovering rocket-powered backpack. At touchdown, the cords are cut and the rocket stage crashes a distance away. The nuclear-powered Curiosity, the size of a small car, is packed with scientific tools, cameras and a weather station.
It sports a robotic arm with a power drill, a laser that can zap distant rocks, a chemistry lab to sniff for the chemical building blocks of life and a detector to measure dangerous radiation on the surface. It also tracked radiation levels during the journey to help NASA better understand the risks astronauts could face on a future manned trip. After several weeks of health checkups, the six-wheeled rover could take its first short drive and flex its robotic arm. Source: Travelfwd+

Thursday, 2 August 2012

Human Immortality by 2045?

Those of us young enough to hang on for another 33 years may be able to extend those three decades into eternity. Of course, you'd have to be okay with living out the rest of your endless days as something better suited to a James Cameron movie. But if transforming into a cybernetic humanoid robot sounds like the thing for you, Russian entrepreneur Dmitry Itskov's 2045 Initiative could use a little help from your personal chequing account in order to fund his dream of humans becoming immortal. The 31-year-old claims he has assembled a team of top scientists to work on the initiative — a six-stage project that would ultimately see our brains housed in a fully functional holographic human avatar by the year 2045. Itskov claims his idea will "free" the majority of people on the planet from "disease, old age and even death" through advanced neuroscience, nanotechnology and android robotics. Source: Article

Sunday, 22 July 2012

KIMS performs the first robot assisted surgery in India


Hyderabad: The Krishna Institute of Medical Sciences (KIMS) has performed a unique surgery using robotic technology, which is believed to be the first of its kind in the whole of India. Dr Bhaskar, Managing Director and CEO of the Krishna Institute of Medical Sciences (KIMS), said: "Robotic surgeries have marked a new era in the medical sector." On the occasion, the KIMS doctors’ team said that robotic surgeries are used for operating upon the large intestine. "There is lesser blood loss, minimal complications, less scarring, a shorter hospital stay and a quick return to work, hence there are fewer expenses involved in the hospital stay," said a KIMS doctor. Source: Siasat

Tuesday, 8 May 2012

3D first for surgeons at Manchester Royal Infirmary

Surgeons at the Manchester Royal Infirmary (MRI) in the UK have become the first to use 3D technology in a prostate cancer operation, during what is hoped to be the first of many procedures using this technique. 62-year-old John Green has undergone a 3D hand-held robotically assisted laparoscopic radical prostatectomy, meaning the prostate, which contains a small cancer, will be completely removed. For the first time in the UK, handheld robot technology has been used in conjunction with 3D imaging. Normally, keyhole surgery is performed with the surgical team using a 2D screen to see inside the body, but because it is a 2D image, there are limitations on depth perception. This means that surgeons have to use their experience and skill to interpret shadows on the screen. The MRI team used an HD 3D camera system which delivers a 3D image on the operating monitor. All of the surgeons and theatre staff wore 3D glasses to enable them to see the image. The combination of these two new technologies will allow for more surgical mobility with keyhole techniques and greater vision, while maintaining the tactile feedback currently gained with laparoscopic surgery but lost with current robotic instruments. The hope is that the 3D image will allow for precision and better results for patients, all at a fraction of the cost of current robotic and imaging techniques. The team plan to trial the technology on a small number of patients to see how effective it is. Dan Burke, the Consultant Urologist leading today's surgery, said: "If the results are as great as we are expecting, we will look to see how we can raise the money to buy the equipment, allowing us to offer it to many more patients. We are already excited at the potential this technology has, not just for us but for our many colleagues in the Trust who perform keyhole surgery. The equipment can be moved easily between theatres so any specialty could benefit. Ultimately we are aiming for a better patient outcome at a cost that will benefit the NHS." Source: InAVate

Thursday, 12 April 2012

A living robot who smiles, sings like a girl


The New Indian Express, London, IANS: A Japanese company has created a robot with 65 facial expressions which can talk, sing and can easily be mistaken for a real-life girl at first sight. The Geminoid F can create smiles and even enigmatic, quizzical expressions, using mechanical actuators underneath her rubber 'skin', the Daily Mail reported. She can smile, furrow her brows and move her mouth, although she often looks rather dazed. She can also talk and sing, playing recordings or 'mouthing' other people's voices. Geminoid F has been produced by Hiroshi Ishiguro, a renowned robot designer at Osaka University in western Japan, whose androids (designed to resemble a human) come with a steep $1.2 million price tag. But Geminoid F is cheaper, at $110,000, which Ishiguro hopes may take the technology closer to the mainstream. Her creator says his goal is to create a robot that can fool people into believing it is a human being. Geminoid F is equipped with 12 motorised actuators, powered by air pressure, which allow her to 'copy' human facial expressions. Ishiguro, who has designed several robots made to look like humans in the past, even building one in his own image, says the day is not far when robots could fool us into believing they are human. "Please define 'what is human', and we will make a copy," he says. Source: The New Indian Express