Swift, a Robot That Picks as Fast as a Human

felix

IAM Robotics Takes on Automated Warehouse Picking

A Pittsburgh startup wants to build a robot that can pick items off of shelves, the holy grail of warehouse fulfillment

By Evan Ackerman
Photo: IAM Robotics

There's a small but growing handful of robotics companies trying to make it in the warehouse market with systems that work alongside humans on order fulfillment. Generally, we're talking about clever wheeled platforms that can autonomously deliver goods from one place to another, while humans continue to do the most challenging part: picking items off of shelves. There's a lot of value here, since using robots to move stuff frees up humans to spend more of their time picking. Ideally, however, you'd have the robot doing the picking as well, but this is a very difficult problem in terms of sensing, motion planning, and manipulation. And getting a robot to pick reliably at a speed that could make it a viable human replacement is more difficult still.

IAM Robotics, a startup based in Pittsburgh, Pa., is one of the first companies to take on the picking problem on a commercial level. Founded in 2012, the company has developed an autonomous mobile picking robot called Swift that consists of a wheeled base and a Fanuc arm with a 15-lb payload and a suction gripper that can reach 18 inches back into shelves. A height-adjustable carriage can access shelves between 3 and 85 inches, and an autonomously swappable tote carries up to 50 pounds of stuff. According to the company, the robot can autonomously navigate around warehouses and is "capable of picking at human-level speeds" of 200 picks per hour.

We spoke with IAM Robotics founder and CEO Tom Galluzzo to find out how they're making this happen. Prior to IAM Robotics, Galluzzo developed autonomous systems for Carnegie Mellon, Harris Corporation, the Air Force Research Laboratory, Boeing, and other organizations. He told us that his experience building robust real-world robotic systems had a big influence on the development of Swift.
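As a purely illustrative aside, Swift's published working envelope (15-lb arm payload, 18-inch shelf reach, shelf access between 3 and 85 inches, 50-lb tote capacity) can be expressed as a simple feasibility check. All names and the function itself are hypothetical; only the numeric limits come from the article.

```python
from dataclasses import dataclass

# Limits quoted in the article (all in pounds and inches).
ARM_PAYLOAD_LB = 15.0
REACH_DEPTH_IN = 18.0
SHELF_MIN_IN = 3.0
SHELF_MAX_IN = 85.0
TOTE_CAPACITY_LB = 50.0

@dataclass
class SlotPick:
    shelf_height_in: float   # height of the pick face on the shelf
    depth_in: float          # how far back the item sits
    item_weight_lb: float
    tote_load_lb: float      # weight already in the tote

def swift_can_pick(p: SlotPick) -> bool:
    """Return True if the pick falls inside the published envelope."""
    return (SHELF_MIN_IN <= p.shelf_height_in <= SHELF_MAX_IN
            and p.depth_in <= REACH_DEPTH_IN
            and p.item_weight_lb <= ARM_PAYLOAD_LB
            and p.tote_load_lb + p.item_weight_lb <= TOTE_CAPACITY_LB)

print(swift_can_pick(SlotPick(40, 10, 2.0, 30)))   # True: comfortably in range
print(swift_can_pick(SlotPick(40, 10, 2.0, 49)))   # False: would overload the tote
```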
As an example, he said that instead of dynamically calculating the best way to pick every item it runs across, the robot queries a database of items that have already been scanned in 3D, modeled, and analyzed to determine the best possible grasping strategies.

IAM Robotics is currently conducting a pilot project with Rochester Drug Cooperative, one of the largest healthcare distributors in the United States. RDC is testing the Swift robots along with inventory tracking technology and fleet management software also developed by IAM. Here's the rest of our interview with Galluzzo.

IEEE Spectrum: IAM Robotics is working on autonomously picking items off of shelves, which is something that most warehouse fulfillment companies aren't trying to do yet, because it's a really hard problem. How did you decide to start there?

Tom Galluzzo: I had been working at Carnegie Mellon's National Robotics Engineering Center as a robotics researcher, and we were working on a project for DARPA called ARM-S, which was a predecessor to the DARPA Robotics Challenge. Basically, what DARPA wanted us to do was have an autonomous manipulation robot handle everyday objects. So we were doing things autonomously like finding objects on a table, picking them up, moving them around... we tried to do really challenging things like changing a tire, assembling parts, all kinds of stuff. We got quite good at general object "find it, pick it up, and move it." When we did that, we started looking for low-hanging fruit in industry, and we were led on a natural trajectory to picking in warehouses. While this is a pretty challenging problem, we felt confident that we could bring value, and manipulation was really the key differentiator there: being able to see and manipulate those products.

Can you put into context for us why other companies aren't doing this yet? Why is it so challenging?
The thing you have to realize is that with technology in the state it's in today, no one is going to solve all of picking right now. We can't get a general machine to do it all; you have to pick and choose which problems you're going to solve. We started focusing on some lower-hanging fruit, which includes consumer products, things in boxes, things in bottles, pharmaceuticals. Initially we chose the suction cup for speed, because the number one thing in this industry is that we have to be as productive as a person. Everything that we had, even all the academic stuff at CMU, was way too slow, and we kind of had to start from scratch. So that's what we did.

When we started doing this, we wrote our whole perception pipeline from scratch and tried to get the robots to just pick things as fast as they could, in a semi-realistic scenario. When we did that, we surprised ourselves with what we could pick with the suction cup: it was more than we expected. And we could also pick really, really fast. With a robot, it's really the perception speed and the speed of the arm. What we solved was the perception speed: we were able to see these products really fast, and the arm speed became the bottleneck. A [human] pick worker, on a sustained basis, really can't pick more than 600 products per hour, and that's without traveling much. I think we've done a demo of 1,100 products per hour with the robot sitting next to a shelf. We were very encouraged by that. Our robot works for certain applications, and we can't do it all; we're fine with that. If we can knock down a couple of key applications, we have time to expand our breadth in terms of what we can pick.
Photo: IAM Robotics. Rather than dynamically calculating the best way to pick every item that it runs across, Swift queries a database of items that have already been scanned in 3D, modeled, and analyzed to determine the best possible grasping strategies.

What kind of constraints are there on your picking system?

It has to be fairly organized. The products have to back one another up, because of the way we're picking them: we're pushing on them from the front, and if you have light products that don't have anything behind them, they're going to tip over. The majority of the work is done manually on the put-away, where people are going to make things a little bit more organized, but you're saving five times what you spend, in addition to your current workforce, by automating the picking side. The economics of picking are such that when you're putting stuff away, you walk to a location one time with a case of product. But when you pick it, you have to walk back ten or twenty times to pick all that product. There's a lot more labor on the picking side. So we figure, let's automate that. The benefit to the customer is that they get very efficient picking for both humans and robots, people get an easier job, and you get a very well-organized warehouse.

Can you talk about what kind of hardware and software you're using for perception, since that seems like a big part of the challenge here?

Yeah, it is. I can't disclose too much, since that's kind of our secret sauce: the perception pipeline we've built on top of off-the-shelf color cameras and depth sensors. What we've done is layer on what we call our rapid vision algorithms, which process the data extremely fast, extract the information we need to know about the pick, and match it against what has already been visually trained for the system.
[The visual training comes from] taking our rapid vision system and packaging it into a standalone little photo-studio scanner that we call Flash. It's just a standalone miniature photo booth. We walk up and down each aisle one time, barcode-scan each product, and put a sample of the product in the scanner. Then we collect images, we weigh the product to see how heavy it is, and we collect dimensions, surface features, those kinds of things.

Okay, so you do all of this data collection in advance, so that the robot is already trained on how to pick each item?

Yes, absolutely. We hook up to the customer's warehouse management system, and we marry that up with the data we have from Flash to send down a visual pick list to the robot.

It sounds like picking relatively small, boxy things is your target market. Is there potential to go into more general e-commerce? I imagine that might be a little more difficult for picking.

Absolutely. We have two models that allow us to go into those general applications. One is doing things like bin retrieval, almost like a light version of a Kiva system. Instead of bringing over a whole shelf, we just retrieve a box of whatever the product is. I also think there are going to be lots of amazing applications in grocery e-commerce.

And how would you have to change your system in order to address that market?

We wouldn't be able to pick produce or anything like that on day one. I would say 75 percent of the stuff in the grocery store is pickable with a suction cup, but we just need more breadth in terms of size, shape, and weight capability. So not just one suction cup, but an array, enabling us to pick lots of different sizes. Beyond that, it's all good.

It sounds like there are plenty of good reasons to use suction grippers, but over the last couple of years there's been a lot of innovation in that space. Have you thought about trying other gripper designs? There's Grabit, there's Empire Robotics...
I think they all have their niches. Robotics is a very physics-constrained world, and things have to fit very precisely in size, weight, and power. That's just the reality of what one type of machine is going to be able to grab and pick up as opposed to another. So to a certain extent I think we're going to see this granularity and differentiation in grippers and arms, and we're going to need to be able to support all of those to fill all the niches.

Humans do well with just one kind of gripper. Do you think that a more anthropomorphic design might be a realistic approach for generalized picking, or do you think it's going to be one of these other designs that are less capable but more forgiving?

I don't think it's a realistic approach in the short term. We used anthropomorphic hands on ARM-S, and I'm not sold on what their exact niche is just yet. There are lots of good reasons why suction cups and two-finger grippers have been the prevalent gripping technology for the automation industry for decades. Until we see a significant step up in software and even some hardware design... Some of this, to a certain extent, I think we want to do at IAM Robotics, but it's just not quite there yet, and we want to get some killer applications under our belt before we expand.

IAM Robotics was at Modex this year, where Tom says they met with something like 150 different organizations interested in their technology. Not bad for a small robotics company's first trade show ever. Over the next six months or so, IAM Robotics will focus on building up its successful initial deployment with Rochester Drug and start scheduling additional pilot projects with other customers. If you want in, they're currently taking orders for late 2016.
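The pre-scan workflow Galluzzo describes, where each SKU is enrolled once through the Flash scanner and the robot later looks grasps up instead of planning them online, could be sketched as a simple keyed store. Every name and field below is hypothetical; only the kinds of data collected (images, weight, dimensions, a precomputed grasp) come from the interview.

```python
from dataclasses import dataclass

@dataclass
class ItemModel:
    # Per-SKU data of the sort the article says Flash collects
    # (images omitted here for brevity).
    barcode: str
    weight_lb: float
    dims_in: tuple           # (width, height, depth)
    grasp_strategy: str      # precomputed offline, e.g. "suction_front_face"

class GraspDatabase:
    """Offline-built lookup: barcode -> precomputed grasp strategy."""

    def __init__(self):
        self._items = {}

    def enroll(self, item: ItemModel) -> None:
        # Done once per SKU at scan time, not during picking.
        self._items[item.barcode] = item

    def grasp_for(self, barcode: str) -> str:
        # At pick time the robot does a fast lookup instead of
        # computing a grasp from scratch.
        return self._items[barcode].grasp_strategy

db = GraspDatabase()
db.enroll(ItemModel("0123456789", 1.2, (4, 6, 2), "suction_front_face"))
print(db.grasp_for("0123456789"))  # suction_front_face
```

The design point is the one made in the article: moving modeling and grasp analysis offline turns the hard online problem into a database query, which is a big part of how the perception step stays fast.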
[ IAM Robotics ]

Published 2018-05-12 21:14

Disclaimer:

This article was originally posted by felix on 大董知识库; copyright belongs to the author.


felix
2018-05-12 21:27
These are all SKUs that are easy to pick with suction, but putting them on the shelves is still heavy manual labor.