Looking back and looking forward
In an interview with Günther Herkommer, editor-in-chief of the Computer & Automation trade journal, Hans Beckhoff gives a review of the last 20 years in PC-based control technology and an outlook on future developments.
Mr. Beckhoff, at the end of the 90s, there was major debate in the trade journals: Is the Industrial PC gaining acceptance in industrial automation and is the traditional PLC on the way out? The former has happened, but the PLC is still alive and kicking. How would you sum up the last 20 years of control technology?
Hans Beckhoff: A lot changes in two decades. In automation technology in particular, there are exciting innovations every year – sometimes even revolutionary ones; however, the actual impact on the market is not usually seen until 10 years later.
At Beckhoff, we delivered the first Industrial PC back in 1986, which means that we have had PC-based control technology ever since. And as early as 1990, during our first presentation at the Hannover Messe trade show, a journalist asked me how long the PLC would still be around. As a young engineer, I leaned back and said: Another five years – an incredibly long time for me at the time!
When this journalist asked me the same question again in 1995, Beckhoff was doing well and we had grown wonderfully with our technology – but PC-based control technology only accounted for a negligible share of the overall market.
On the one hand, this is due to the time constant mentioned at the beginning. On the other hand, there is of course a certain inertia on the part of the large suppliers of control technology, which encourages them to stick with the tried-and-tested technologies – such as PLC technology.
Nevertheless, we are convinced that IPC technology is by far the most powerful and often the least expensive platform. It is also the platform that enables the best possible integration of IT and automation features.
Also around the turn of the millennium, the hour of Ethernet began to strike in the industrial environment. In 2003, Beckhoff itself presented EtherCAT, a corresponding solution that is now internationally widespread and accepted. Did you expect it to become so successful?
Hans Beckhoff: We were indeed optimistic and knew that what we had was something good. But we weren’t aware at the time that we were defining a kind of global standard with EtherCAT. As it happened so often in our company’s history, we progressed with a certain ‘naive’ optimism and belief in our own strength and developed this technology out of our own conviction.
At that time, however, we were already seasoned fieldbus experts: On the one hand with regard to our own communication systems, which we had already launched on the market. On the other hand, we also knew all the other fieldbus systems – essentially CAN bus and PROFIBUS. Compared to all these existing solutions, the development of EtherCAT ultimately represented a real quantum leap: On the one hand in terms of performance, which we had optimized in such a way that we could use a single Ethernet telegram to collect bits and pieces of information from many participants in the field. On the other hand, we built distributed clocks into the system from the outset in order to integrate an absolutely accurate system time into an automation system. Another novelty we introduced: At the time, every bus had to have a master card – a fact that is almost forgotten today. With EtherCAT, this was no longer necessary; instead, the system could be operated on any standard Ethernet port.
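The single-telegram principle described here can be sketched in a few lines. This is a simplified illustration under assumed names and a made-up frame layout, not the actual EtherCAT frame format: one frame travels past every node, and each node reads its commands from, and writes its input data into, its own reserved slice of that frame.

```python
# Hypothetical sketch of "processing on the fly": a single telegram
# visits every node in turn; each node exchanges data with its own
# reserved slice of the frame as it passes through. Names, offsets
# and the frame layout are illustrative, not the EtherCAT spec.

class Node:
    def __init__(self, name, offset, size):
        self.name, self.offset, self.size = name, offset, size
        self.outputs = bytes(size)  # input data this node reports back

    def process(self, frame: bytearray) -> bytearray:
        # Read the commands addressed to this node ...
        commands = bytes(frame[self.offset:self.offset + self.size])
        # ... and insert the node's own data into the same slice.
        frame[self.offset:self.offset + self.size] = self.outputs
        return frame

# The master assembles one telegram for the whole segment:
nodes = [Node("drive", 0, 2), Node("io", 2, 2), Node("sensor", 4, 2)]
nodes[2].outputs = b"\x12\x34"   # e.g. a measured value

frame = bytearray(6)             # one telegram for all nodes
for node in nodes:               # the frame passes each node in turn
    frame = node.process(frame)

print(frame.hex())               # -> 000000001234
```

One round trip of one frame thus serves every participant, which is why no dedicated master card is needed: any standard Ethernet port can send and receive such a telegram.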
After the first positive reactions from the market, we finally decided to make the EtherCAT technology available for open use. In this context, we founded the EtherCAT Technology Group. The release of the technology has certainly contributed significantly to the worldwide success of EtherCAT.
In your opinion, what other developments – apart from IPC and Ethernet – have had a significant impact on automation technology over the past 20 years?
Hans Beckhoff: In 1998, we were able to offer IPCs with one CPU core and a clock frequency of 1 to 2 GHz for controlling a machine. Today we supply Industrial PCs with up to 36 cores and a clock frequency of 4 GHz. This shows that hardware development has made great progress – in other words, Moore’s law has proven its validity over the years. And we believe that this will be the case for at least the next 10 years. If today we can integrate image processing or measurement technology into the control system, if we can synchronize 100 axes in one machine instead of 20 and if path control is possible at the same time, then we owe it to this increase in performance.
Another decisive development over the past 20 years has been the combination of functional areas, for example by integrating safety into standard control technology. And as far as drive technology is concerned, such new drive types as our XPlanar, the levitating planar motor system that we nicknamed the flying carpet, and of course, the eXtended Transport System (XTS) based on inverse linear motors have been successfully introduced to the market. Basically, I see a trend for the future in specialized magnetic drive forms, because today they can be mastered algorithmically, which means that a lot of mechanical effort on the machine can be replaced by software functionality.
Especially with regard to software, the last 20 years have also been the time when the IT world has moved even closer together with the automation world. In the case of TwinCAT 3, for example, this has meant the integration of the various tool chains such as Visual C, C++ and IEC 61131 into Microsoft Visual Studio. A further advantage lies in the integration of MATLAB/Simulink and then measurement technology and image processing as a result. In short: I consider this consistent integration of functions originating from different areas or even from different companies in one software package to be one of the most important development trends of the last two decades.
All in all, automation technology has, in retrospect, become simpler and more cost-effective. Think, for example, of one-cable technology or the electronic motor nameplate – 20 years ago, this was either rare or non-existent. At the same time, costs per axis in control technology have decreased by between 20 and 40 % during this period.
One topic that has been on the Beckhoff agenda for over six years, but for which Beckhoff has not yet presented a market-ready solution, is completely PC-based or freely programmable safety technology. Why does this topic seem to be such a difficult one?
Hans Beckhoff: There are two different things that we have to consider here: First, we have been supplying hardware-based safety – i.e. the input and output terminals or safety logic terminals – for around 10 years now. These are freely programmable with a graphical editor and cover around 80 % of all standard safety functions. We have also decided to do without the safety hardware CPU and replace it with a purely software-based runtime. We have already developed the mathematical basics and special compiler techniques to do so. Internally, this is now a finished product – the only thing still missing is a simple graphical editor. It will be available by the end of next year and then the official market launch will take place!
Industrie 4.0 has been a major topic in the industry for more than five years now. What is your own definition of Industrie 4.0 and how do you see the industry today in this respect – internationally as well?
Hans Beckhoff: Industrie 4.0 is a complex topic – and that’s why your question is not so easy to answer, especially because such buzzwords as digitization and IoT are often used in this context.
Let’s start with digitization: Digitization is something that the industry and the world have been experiencing since 1970. The further development of hardware and software concepts has permeated more and more areas of life – and thus also industry – with electronic data processing aids. In this respect, I don’t see a major leap in development, but rather a development that has been going on for a long time but is accelerating. The fact that German industry is still very competitive shows that domestic companies have done their homework quite well in this respect compared with other countries.
The third industrial age, in which we found ourselves until recently, was based on the Acatech model – which, as we know, invented the term Industrie 4.0 in the year 2011. In this model, the production environment is characterized by the local intelligence of machines. The fourth industrial age, which has just begun, is now characterized by the fact that this local intelligence is combined with cloud intelligence. This is already my main concept of Industrie 4.0 – i.e. machines that can ‘talk’ to each other via the cloud or call up services from the cloud and use them for processes on the machine. Conversely, a higher-level intelligence can also see the machines as an extended output arm.
At Beckhoff, we can well imagine that some machine intelligence is shifting towards the cloud – we call this the ‘avatar concept.’ Examples of this are the control of a machine with speech recognition running in the cloud or vibration analyses for predictive diagnoses, which do not have to be carried out online, but can be carried out offline in the cloud. Even today, however, we can ‘cloudify’ the entire PLC – depending on availability, bandwidths and achievable response times. With technologies such as 5G, a lot seems to be feasible here; however, the response times here are still above 1 ms – so a packaging machine, for example, cannot yet be controlled in this way.
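The response-time argument can be made concrete with a back-of-the-envelope check. The numbers below are illustrative assumptions, not measurements: a controller that must react within one machine cycle cannot move to the cloud if the network round trip plus processing already exceeds that cycle.

```python
# Illustrative latency-budget check for 'cloudifying' a PLC.
# All figures are assumed example values, not benchmarks.

def can_cloudify(cycle_time_ms, round_trip_ms, compute_ms):
    """True if cloud round trip plus processing fits into one control cycle."""
    return round_trip_ms + compute_ms <= cycle_time_ms

# A fast packaging machine with a 1 ms cycle:
print(can_cloudify(cycle_time_ms=1.0, round_trip_ms=1.2, compute_ms=0.1))  # 5G-class latency today
print(can_cloudify(cycle_time_ms=1.0, round_trip_ms=0.3, compute_ms=0.1))  # projected sub-ms networks
```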
Now we can make a projection and ask: What will communication look like in 20 years? Personally, I think that we will then be around 100 GBaud and, with the help of special switching and wireless technologies, we will be able to reduce the response times for centralized applications to well under a millisecond. And so in 20 years, your colleagues will be able to write retrospectively: 2018 was the time when the machines hesitantly began to talk to the cloud and retrieve services from the cloud – today, this is completely normal!
In your opinion, what further developments will decisively change automation in the next 20 years?
Hans Beckhoff: The basis for intelligence on the machine is, among other things, the hardware. This will continue to be determined in the next few years by Moore’s law, so that in 20 years we will certainly be able to use computers on machines that are 100 times more powerful than today. That would mean that you can control 100 times as many axes or cameras, or you can operate a machine with a lot of cameras 10 times faster. In this respect, we believe that, for example, the use of image processing systems on the machine – also as sensors and not just for workpiece evaluation – will increase dramatically.
On the other hand, as computing power and communication bandwidth increase, so do the cloud's capabilities – at least by the same factor. Here, too, the engineer's imagination is ultimately required to decide what can happen in this cloud. In this context, terms such as artificial intelligence (AI) and machine learning emerge – topics that will certainly have repercussions on machine functionality not in 20 years’ time, but in the next two to three years. At Beckhoff, we have also already founded a working group that investigates artificial intelligence algorithms for possible applications in automation – including path planning in robotics and sensor data fusion. The first results in these fields are very promising!
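One of the application fields mentioned, sensor data fusion, can be illustrated with a minimal sketch: inverse-variance weighting of redundant sensors. The function name and all numbers are hypothetical examples for illustration, not a description of any Beckhoff product.

```python
# Minimal sensor-data-fusion sketch (inverse-variance weighting):
# several sensors observe the same quantity; the more precise a
# sensor, the more weight its reading gets. Values are made up.

def fuse(measurements):
    """Fuse (value, variance) pairs into one estimate and its variance."""
    weights = [1.0 / var for _, var in measurements]
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
    variance = 1.0 / sum(weights)
    return value, variance

# Two sensors measure the same position; the precise one dominates:
value, variance = fuse([(10.2, 0.04), (9.8, 0.16)])
print(round(value, 2), round(variance, 3))   # -> 10.12 0.032
```

The fused variance is smaller than that of either sensor alone, which is the whole point of combining them.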
In the age of Industrie 4.0, there are more and more traditional IT companies or internet corporations, such as Google, Amazon and others, who are trying to make a mark on industrial automation. Are the established automation technology manufacturers now running the risk of losing their ‘piece of the pie’?
Hans Beckhoff: I don’t think so. After all, the big IT companies – Google, Microsoft and SAP – are approaching the application level from above. In other words, they have introduced edge computing concepts that in turn can contain local intelligence as well as machine control intelligence. In this respect, traditional machine control manufacturers are still way ahead in terms of their knowledge base because automation technology is really complex. So, I’m not worried that Google might suddenly offer motion controls or more complex measurement technology. And what’s more, the market is simply too small for these companies.
The large IT companies are primarily interested in the data because lucrative business models can be derived from it. Controllers or machine builders can supply this data.
But aren’t these business models, rather than the pure control hardware, the attractive ones with which machine builders will also want to earn their money in the future?
Hans Beckhoff: As far as that is concerned, there will certainly be competition between automation suppliers and data processors. In addition, many machine end users have also developed their own strategy for this purpose.
Nevertheless, we hear again and again that data is the oil of the 21st century. In order to implement the new data-driven business models, however, users must also be prepared to make their data available. Is this one of the major reasons why it usually doesn’t work yet?
Hans Beckhoff: Let me put it positively. First of all, I think that concerns about data security are much more pronounced in Germany than in other countries.
However, if you want to successfully develop business models in this area, you should put that fear aside and consider what you could gain from all the data. Within the German AI community and even within the Federal Government’s ‘key issues paper on artificial intelligence’, there is a proposal to develop an anonymized general database into which personalized data can be imported and then made available anonymously as a general data pool for a wide range of different possible uses.
There are also many other practical methods: We have agreed with some of our customers, for example, that they occasionally run a test cycle on the machine that makes no statement about what has just been produced. During this test cycle, data is written that can then be used for predictive maintenance.
In short: There are solutions to the problem of data security. I would always recommend not putting too much emphasis on fear at first, but rather looking positively at the different options available instead. |