- Speaker #0
Hi everyone, good morning, good afternoon, good lunchtime from wherever you are around the world. I'm happy to welcome you to this session of UA Goes Live on LinkedIn Live, the first of this year. This session will be a bit special, because the topic is AI services, and we saw a lot of interest from you when we put up the title "AI services" and the content that we will share with you. So we decided to give you more, meaning that today is part one of the AI services sessions, and next month there will be a part two. So please enroll for part two as well; the invitation will come next week. And what will today's topic be? Today's topic comes from Luleå University of Technology in Sweden, with Professor Vyatkin, who is also the CEO of FlexBridge, one of the research spin-off companies within the university. Hi, Valeriy, come on stage with me. Hi, Valeriy. You need to unmute yourself.
- Speaker #1
I'm sorry. Yes,
- Speaker #0
I can hear you. Hi, Valeriy.
- Speaker #1
Hello, everybody. Hello, Greg. I'm really happy to be here this afternoon. And yes, thank you for the introduction. I'll talk today about our recent developments in the area of assisting software developers with AI power, based on IEC 61499.
- Speaker #0
Oh yes, of course. What else? Let me share your screen and you can go.
- Speaker #1
Okay, so let me switch to my screen.
- Speaker #1
I'm talking today about a function block assistant. My name is Valeriy Vyatkin. So first of all, a couple of words on who we are. FlexBridge is a startup, a spin-off of a research lab at Luleå University of Technology, located in Luleå, northern Sweden. Our expertise is in IEC 61499 and decentralized control systems, and we also do professional education. Our role, and why I'm speaking today on behalf of FlexBridge, is to fill the gap and complement the university with the capabilities of a company that can make prototypes of real products and bring them to the market. That is how we see our role. We are very motivated by the challenge of industrial flexibility, and we see the solution to industrial flexibility in decentralized swarm intelligence: in the shift of production systems to being automated in a distributed way and controlled with a swarm of interconnected control devices. From that perspective, IEC 61499 is the only bet these days for system-level software design of this kind of decentralized automation system, and that is why we are heavily into IEC 61499. But this view on automation systems is impossible to implement if we do not have the corresponding hardware. So FlexBridge, some years back, developed a totally different kind of automation device: a very small but very powerful node that is directly connected to a set of sensors and actuators, and on the other end is connected to modern communication networks such as Wi-Fi and Wi-Fi 6, and later also to cellular networks such as 5G. This small node, which we call IceBlock, we see as a basic block of future automation systems. We have prototyped this concept through several research projects and activities; you see several on the screen. For example, at Aalto University a demonstrator was developed for a heterogeneous automation system under the umbrella of universal automation, of which IceBlock is a part.
We have been involved in a European project with the Italian company Repack, automating Repack machines using IceBlocks. We have also been collaborating with the Spanish company Hovering Solutions to automate the payload on drones flying in tunnels, using IceBlocks. So here you see this universal automation heterogeneous demonstrator of which IceBlocks are a part, along with the devices of other universal automation members, interoperating and collaborating in a single small example. We have also used the 5G version of the IceBlock devices on the same testbed, with an indoor 5G access point, proving high throughput and high reliability of communication using this technology. Another ongoing project with IceBlock is on using IEC 61499 and 5G in decentralized grid protection. We have built a testbed where IceBlocks are connected to a powerful simulator of a smart grid system, where we reproduce the automation of a digital substation, and the IceBlocks represent the controls of the primary equipment within the power distribution substation. Okay, but today I'm going to talk about a totally different thing: not hardware, not IceBlock, but software, a software tool that we are prototyping and developing at FlexBridge. One motivator for this development has been the recently started European project Medusa, a manufacturing-as-a-service framework exploring decentralized secure data exchange and promoting sustainability and circularity. In this project, where FlexBridge is a partner, we are focusing on a tool that is a function block assistant, and this is what I'm going to talk about in the next part of my talk. But I need to start with a little introduction. In IEC 61499, as many of you know, a basic way to represent the control logic, one of the ways, is to program basic function blocks using state machines. So here is an example from one of our previous projects.
The control of one commercial machine was implemented as this: a state machine, called in IEC 61499 an execution control chart (ECC), where in graphical form we represent the sequential logic and the input-output behavior of the controller. These state machines can be quite complex, and the challenge is how to create them from specifications, from the requirements that automation engineers have. Today I'm going to walk you through this process. Okay, so let me first explain the workflow of the FB Assistant tool. The basic element of technology support that we need for developing IEC 61499 applications is an integrated development environment, right? We know that there are several such environments, for example the Eclipse 4diac IDE or the EcoStruxure Automation Expert IDE. This IDE is the basic tool that the developer is using. The developer creates function blocks, saves them, builds applications from function blocks, and stores the function blocks in a repository. Then, when an application is built, it is deployed from the integrated development environment to a compliant PLC, and it starts controlling a real process in the plant. We also often use so-called soft PLCs, where we deploy code on the same machine where we run the integrated development environment. And often we use visual simulation to support the commissioning and debugging of our automation applications. So this is the basics of IEC 61499 development. Where is the function block assistant here? What is its role in this workflow? The developer communicates with the function block assistant using natural language and provides requirements in natural language. The function block assistant then creates state machines from these natural language requirements and lets the developer check those state machines
and reply whether these state machines, in the opinion of the developer, correctly represent the requirements. There can be several iterations, after which the developer comes to the conclusion that the state machine at some stage is good enough and represents the requirements correctly. At that point, the function block assistant can connect to a running simulation of a process in the IEC 61499 soft PLC and do a quick simulation of the logic in direct connection with the model of the process. And once this quick simulation satisfies the developer, the FB Assistant can generate a function block out of the state machine and put it into the IEC 61499 repository, where it can be immediately taken and used in the classic cycle of IEC 61499 development. So this is the workflow that we envisage for the Function Block Assistant. Now I'm going to walk you through some very small examples of using the Function Block Assistant, so that you can better appreciate what is going on. I will use really Mickey Mouse examples, and I have pre-recorded a video to avoid longer delays in a real live demonstration. So it will be a pre-recorded video, but I will speak over it. To illustrate the process, we start with the first example: a double-acting pneumatic cylinder, which has two actuating signals, move forward and move backward, and two end-position sensors indicating the left position and the right position. It also has several buttons, as you can see on the screen, through which the operator can interact with the automated process: start, stop and so on. So we start with
a software testbed, where the controller function block of the cylinder is connected to several function blocks representing a model of the cylinder itself and models of the buttons. This is our virtual commissioning environment, where we will want to test the automation logic. But the controller function block itself is empty; you see that inside it's a dummy controller block, a basic block in which there is nothing. We have an empty state machine inside, and because of that, if we press the buttons in the simulation, nothing happens, because there is no logic inside. It's just an empty block. So as a starting point for development, we open the Function Block Assistant, and there we also start with a dummy, almost empty state machine: a module that has a couple of inputs and one output, with just two states, a start state and an initialization state where we set the dummy output to false. That is our starting point. The next step is to add requirements in natural language. First we want to tell the assistant something about our system: what are the inputs and what are the outputs, and what do they mean? After that, we will give another requirement which explains the expected behavior of the system. So let's start with the first one. We have this small strip with a text editor where we can select a requirement and insert it into the chat with the artificial intelligence, then see how the AI modifies our state machine, which so far is still the dummy state machine. So we select this text, insert it into the buffer, insert it into the chat, and send it to the AI to reason and react. And the AI, after some thinking, creates another version of this state machine, called dummy A.
Now it has, as you can see, a populated interface with the full list of inputs and outputs that we described in natural language. But the state machine itself hasn't changed; only the interface has changed, according to the provided requirement number one. Now we select requirement number two, which describes what we want the cylinder to do. We say that first it has to retract to the initial position, then it has to operate in a cycle: reacting to the start button it should move to the right, then wait, and reacting to the next start button press, move to the left. Okay, so we again do the same trick. We insert it into the buffer, give a new name to the expected result, calling it cylinder control 1, and send it to the AI. Now the AI has created a new state machine based on this requirement, and this state machine already looks interesting. It describes the behavior of the cylinder according to the requirements: first retract to the left position, then engage in the cycle reacting to the start button. So that looks like quite nice reasoning by the AI, and we need to analyze whether it is good enough. Here we can see that in the init state it is still assigning the dummy output variable; we had not reacted to that when it created the dummy A version of the state machine, and it propagated through to this next step. So now we want to correct it, and we type in the chat that we want to get rid of these dummy inputs and outputs in the created state machine. A little editing here in the chat, and after that we again pass this recommendation to the AI, with the hope that it will react and correct our state machine model. Selecting, inserting and finally sending. And here is the result: the assignment of the dummy output has been removed from the init state. So now the state machine looks quite correct in the eyes of the developer.
It first takes the cylinder back, retracting to the left position, and after that it goes through the cycle, reacting to the start button. The developer should be happy that the requirement has been implemented in this state machine logic. And now, in the Function Block Assistant, we can connect to the EcoStruxure Automation Expert soft PLC directly from the assistant. We have here the magic button that we press, and with that we start the simulation. In the log window you see that it goes to state init, then to state left, then changes to state at-left, where it is supposed to wait for the button press in our HMI. So we press the button here, and the cylinder starts moving. So, directly with a state machine created by the AI, we are now able to control the cylinder. And as you remember, the control logic inside Automation Expert is empty: the Function Block Assistant is communicating directly with the runtime of the soft PLC in Automation Expert. Now, once we have validated in this quick way that the state machine logic is correct, the next step in the use of the Function Block Assistant is to generate the function block based on this logic. We select this version of the state machine, go to the generate FBT button, and see that the function block file is created. Then we switch to EcoStruxure Automation Expert and see that it has immediately detected the new function block, and we need to reload the solution. We reload the solution, select the function block, and here is the function block auto-generated from the state machine. You can probably see that they are quite similar in structure, but the layout is a little bit ugly; we haven't worked on the layout too much yet.
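The cylinder behavior described in this demo can be sketched as a small state machine. This is a rough, illustrative Python sketch, not the assistant's actual output and not the generated ECC; all state, input and output names are assumptions made for illustration. Like an ECC, it takes at most one transition per invocation:

```python
# Illustrative sketch of the cylinder logic: retract to the left end
# position first, then cycle on the start button. Names are hypothetical.
class CylinderFSM:
    def __init__(self):
        self.state = "INIT"
        self.move_fwd = False   # output: extend actuating signal
        self.move_bwd = False   # output: retract actuating signal

    def step(self, at_left: bool, at_right: bool, start: bool) -> str:
        """One execution cycle: read inputs, take at most one transition."""
        if self.state == "INIT":
            # First requirement: retract to the initial (left) position.
            self.move_bwd, self.move_fwd = True, False
            self.state = "GO_LEFT"
        elif self.state == "GO_LEFT" and at_left:
            self.move_bwd = False
            self.state = "AT_LEFT"
        elif self.state == "AT_LEFT" and start:
            # Start button: extend to the right end position.
            self.move_fwd = True
            self.state = "GO_RIGHT"
        elif self.state == "GO_RIGHT" and at_right:
            self.move_fwd = False
            self.state = "AT_RIGHT"
        elif self.state == "AT_RIGHT" and start:
            # Next start press: retract again, closing the cycle.
            self.move_bwd = True
            self.state = "GO_LEFT"
        return self.state
```

Driving `step` with a sequence of sensor readings plays the same role as the quick simulation against the soft PLC in the demo.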
So now we have this function block within Automation Expert, and we can just run the whole system and see how it works in EcoStruxure Automation Expert. We click and substitute the dummy controller with the newly created auto-generated function block type, and then we start and run this function block application deployed to the device. We save it, go to deployment, go to the device and log in, because the device was logged out after we re-uploaded the solution. No syntax errors, so everything was created correctly. We start the HMI and test it by pressing the start button. Again, it works as expected: a second press and it returns to the home position. Okay, so with this we have demonstrated the first use of the Function Block Assistant on the Mickey Mouse example of a pneumatic cylinder. Let me now reiterate what we have seen. We started the development with natural language requirements and an empty function block, an empty state machine. Then, with a couple of iterations, we achieved a working state machine with correct logic. We tested it in a quick simulation, connecting to the soft PLC runtime, and after that we generated the function block, imported it into the IEC 61499 IDE, and ran it as part of an IEC 61499 application. So that is the first, very quick and dirty, hello world example of using the Function Block Assistant. Now I have another example; I want to walk you through a more complex state machine development using the Function Block Assistant. That was a series of experiments where we again started with an empty dummy state machine, but now for a pick-and-place manipulator. We have a model of that pick-and-place manipulator, developed long ago and used in our research, so I just implemented the connection and the development of control for this module.
We start again with requirements written in natural language, and first we give the assistant the list of inputs and outputs. I intentionally omitted one input, just to see whether it would be able to create the control without it. Then we also give it a description of what we expect the manipulator to do. So we ask it to modify the interface and then add the control logic. In the first step, the assistant has correctly modified the interface and created a state machine which has a lot of operations implemented in the ST programming language within the states. We then react to that, saying that we don't like this design: do not use if-then-else statements in the state actions, and develop fully state-machine-like logic, not combining many assignments in one action. We give this requirement, and in response it creates this state machine. This state machine already looks like a good start. It doesn't have obvious errors, but there are still some complaints. For example, it returns in the cycle to the init state, which is not a good thing, because initialization has to happen once. So we ask it to create a separate state for that, and it does. But some states are not populated with any actions, like this drop-off state that was created, which doesn't have any actions, so we again write to it asking why that is. This whole dialogue has taken several steps, and all the steps are recorded in this log. At every step, the assistant makes some corrections, and we proceed step by step through the process of achieving the desired behavior of our system. As the next requirement, we ask why it would turn on the vacuum before reaching the workpiece, which was the case in one state at step four.
The assistant corrects that: now the vacuum is no longer turned on, it is turned off, in all these extension states, pick H1, pick H2, and pick H1-H2, and in the place state it turns the vacuum on. So we go through these requirements in the dialogue with our assistant, refining the state machine to the liking of the developer. And at some point we need to remind it how to reach each tray, because
in two cases it correctly reaches the tray, but in the last state, the H1-H2 one, it wasn't correctly setting the output signals. So we corrected that, went through several steps again, and finally arrived at step 16. Here in step 16 we have the resulting state machine, which looks like this. A visual check suggests it is correct, so now maybe it's time to test it. I just generate a function block type and go check whether this logic works correctly in the EcoStruxure environment. So this is the ECC auto-generated from the state machine. We put the name of this newly generated block, control 16, in here, and now we deploy and start. After we deploy it, we put the workpiece in and see the behavior. And oops, the behavior is not 100% correct: on the return it's doing something strange. You see, it's extending the vertical cylinder while retracting the horizontal ones. So we can complain again in natural language, saying that it shouldn't do that, and see how the assistant reacts to this complaint. Save, as you are already familiar with, insert and send. What it has created is version 16a. In this version 16a, if we check the retract state, it correctly no longer turns on the vertical extension; now it is false, whereas in the previous version we tested it was on. Now we generate another function block type and test it in the EcoStruxure environment. We correct this function block, select version 16a, and deploy. Let's see if it works correctly. We insert a workpiece. Yes, the problem has been corrected. Let's see how it reacts with workpieces in all three trays. Okay, so that has been a small demonstration of the capabilities of our prototype Function Block Assistant tool. And now, any questions? I'll be happy to have some discussion with you.
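The dialogue pattern both demos follow, one requirement per chat turn, each turn producing a new named version of the state machine, can be sketched roughly as below. `ask_ai` is a hypothetical stand-in for the real LLM backend, and the data shapes are illustrative, not the tool's actual API:

```python
# Sketch of the iterative refinement loop from the demos. The real tool
# sends the requirement plus the current state machine to an LLM backend;
# here ask_ai is stubbed so the control flow is runnable end to end.
def ask_ai(fsm: dict, requirement: str) -> dict:
    """Stub: pretend the AI returns a revised state machine version."""
    revised = dict(fsm)  # do not mutate the previous version
    revised["applied"] = fsm.get("applied", []) + [requirement]
    return revised

def refine(fsm: dict, requirements: list) -> list:
    """Feed one small requirement per chat turn and keep every
    intermediate version, as the assistant's step log does."""
    versions = [fsm]
    for req in requirements:
        fsm = ask_ai(fsm, req)
        versions.append(fsm)
    return versions
```

Keeping each turn small, one requirement or one complaint at a time, mirrors the trick discussed later in the Q&A for not overrunning the model's context.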
- Speaker #0
Great, Valeriy, thank you. Thank you for your time and for this explanation of how to write function blocks in IEC 61499 with, let's say, natural language. That was very enlightening and interesting. One question. I don't see any questions on the screen yet, but don't feel shy to ask your questions; Valeriy is here and will be happy to answer them. One question which came to my mind: we saw that in the way you are doing your prompting for your AI, you always start with the inputs and outputs. You give those to your AI, and then you look at the state machine that you successively, let's say, improve. Why are you doing it this way? Is it from experience that you should always start with the inputs and outputs and then move to the state machine? Or what is the reasoning behind it?
- Speaker #1
Yeah, thank you for the question. I think it meets my common-sense expectation. When we start from scratch, we need to tell what the system is, what our automation object is. And it corresponds to the development of any automation code: first, you need to know what your inputs and outputs are, what the object has, right? This is the basic part of the prompt that we give to the assistant: what are the IOs, and what do they mean? Then comes the expected behavior description. After that, the assistant is capable of creating an initial version of the automation logic, and then we improve and refine it in successive iterations.
- Speaker #0
And reach the desired result.
- Speaker #1
Yeah. And eventually converge to a result that is reasonable.
- Speaker #0
And can you share with us which tool is behind it? Which AI did you use?
- Speaker #1
Currently, we are using ChatGPT as the backend engine. But we are working on different options.
- Speaker #0
Okay. So you can say that ChatGPT is already a bit trained to help us draft some function blocks with the tool. That's very good news. And something else, and I see the question coming in the chat as well: this FB Assistant, today you use its result to, let's say, copy and paste into the EcoStruxure Automation Expert environment. But is it an add-in tool for EcoStruxure Automation Expert, or is it a standalone tool that you could reuse with any IEC 61499 engineering environment?
- Speaker #1
At the moment it is a standalone tool, but we found ways to talk to both the runtime and the tool itself. So yes, with the data exchange we are able to operate quite seamlessly. If we run it side by side, the developer can use the FB Assistant side by side with EcoStruxure Automation Expert.
- Speaker #0
To test and to simulate and to see the behavior produced by the tool. And about using AI: some of you may have used AI in the past and seen its limitations, especially with prompt engineering. Sometimes the problem is the history of the prompting: when the AI has too much information, it kind of loses its mind and forgets the earlier information. Did you test that, and how did you circumvent that behavior of the AI?
- Speaker #1
I think that's part of our approach, to avoid this. We are trying to carefully feed the AI: feed it with some kind of past knowledge first, and then avoid too-big steps. That's the trick.
- Speaker #0
I think that's a very good trick, because for those using AI, that's always a bit the problem: when you want to refine something with many iterations, at some point the last iteration loses the first command that you asked, and you need to re-ask everything from the start. So that's a good way to circumvent that. I'm just reading the chat, and I see a very good question: is the Function Block Assistant capable of creating only one basic function block, or also other function blocks, like composite function blocks? And I think the question behind it is: we have a full system, and at the end we have one block with all this algorithm, this ECC, inside. But are there any subcomponents that we can have to kind of ease the...
- Speaker #1
I cannot reveal too many details. What we have shown today is capable of creating one basic function block, and then we will progress from there.
- Speaker #0
Yeah, that's very good. And in fact, I very much like your approach, which is very much asset-centric. You really look at the assets, you look at your inputs and outputs. From there, you say, okay, those are my inputs and outputs, let's start with that, because I know that my user can touch those; I know what my actions are, okay, left, right, vacuum, go up, go down, and so on. And really just from that, you say, okay, I want this behavior. Which is in fact a very natural way of doing things. And maybe an additional question; I don't know if you have the answer to that. We saw that what is used as a programming language is mainly Structured Text. Is there a way to use other programming languages, like Python or any other, and so on? Is that within the function block, or within the state machine?
- Speaker #1
Well, of course, it's not a big barrier. I want just to give a small hint. State machines are actually a basic design instrument, not only for IEC 61499. They are a universal design instrument for any digital system, right? Including also IEC 61131. If you have a state machine, there is a very mechanical approach for generating IEC 61131 code from it.
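How mechanical that translation is can be illustrated by emitting an IEC 61131-3 Structured Text CASE skeleton from a transition table. This is a simplified sketch of the idea, not the assistant's actual generator; the table, the variable naming and the output format are all illustrative:

```python
# Illustrative generator: turn a transition table into a 61131-3-style
# Structured Text CASE statement. States are numbered by table order.
def to_st(name: str, transitions: dict) -> str:
    """transitions: {state: [(condition, next_state), ...]} -> ST text."""
    states = list(transitions)
    lines = [f"CASE {name}_state OF"]
    for i, state in enumerate(states):
        lines.append(f"  {i}: (* {state} *)")
        for cond, nxt in transitions[state]:
            lines.append(
                f"    IF {cond} THEN {name}_state := {states.index(nxt)}; END_IF;"
            )
    lines.append("END_CASE;")
    return "\n".join(lines)
```

Each state becomes one CASE branch, and each transition becomes one guarded assignment of the state variable; state entry actions could be emitted into the branches the same way.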
- Speaker #0
Yeah.
- Speaker #1
And I cannot promise anything, but it would be stupid not to implement that.
- Speaker #0
The word is out there. The word is out there.
- Speaker #1
And speaking about what is used within the state machine: we do not use any particular syntax of any particular language. We have just taken ST for convenience, but it's quite easy to tweak to any other type of internal syntax within the state machine. The state machines we have been playing with so far are quite elementary, with quite elementary operations within the states. We were focusing more on the ability of the AI to build complicated, interconnected logic of states, not on over-complicated insides of the states. That was the focus so far.
- Speaker #0
And that's quite interesting.
- Speaker #1
Because I believe that this is the way a human refines requirements. Natural language requirements are in most cases full of misconceptions and misunderstandings. So we start with something, then we visualize it with the state machine, and through that process we refine the requirements. What it is, in the end, is a kind of co-creation process of a human and the AI, where they go through this process together and arrive at much better formulated requirements, in the form of a state machine, than what was there in the form of natural language text. After that, it's a quite mechanical implementation process to a function block, and then uploading and testing it.
- Speaker #0
Yeah. And another question to you, as you are one of the fathers of IEC 61499, let's say, one of the ones implementing the standard. We often get the question: but in fact, isn't 61499 a programming language? Can you maybe help debunk that a bit? What is 61499?
- Speaker #1
Well, IEC 61499 is definitely more than a programming language. It's an architecture for system-level thinking, a description of a system. It is a language, but a language to describe a distributed system in its entirety, right? On the top level, it includes the interconnected functionalities of different nodes and different components talking to each other. And the next layer is the devices where you deploy this functionality. This all is a system configuration in IEC 61499, and this is quite unique; no other technology for automation provides this capability. That's why we have to take IEC 61499: systems are getting more and more complex, and the need for this kind of system-level view is increasing, also because of AI. Because if we have a software-defined description of the entire system, that paves the way to using tools like our FB Assistant, so we can auto-generate almost everything, reading our minds, right?
- Speaker #0
Yeah, and iterate, and iterate away. I see a good question from Zafari in the chat. The question is: will AI assist in debugging an existing state machine designed manually?
- Speaker #1
Yes, that's a good suggestion. For that, we would need to implement some way to import those state machines into our system, to not start from scratch but start from somewhere. And this is pretty easy. Currently it's not there, but that would be a nice next feature to implement. After that, the developer could continue improving that state machine in a kind of natural language dialogue.
- Speaker #0
Kind of starting with a pre-existing design instead of only natural language, saying, okay, I think the system could be like this.
- Speaker #1
Yes.
- Speaker #0
I put it in my FB Assistant and say, okay, here, this is how I think it should be. Can you make it a bit better? Yeah.
- Speaker #1
Actually, in the project that I mentioned, the Medusa project, that is exactly what will be implemented, because the goal of the Medusa project is to develop technologies that help modify an existing automation system to implement new requirements. And the FB Assistant already can do that, so we will just implement the import. And we probably could also implement import of an existing function block into the FB Assistant.
- Speaker #0
Valeriy, are you talking about migration?
- Speaker #1
No, no, no. Migration is a different thing. I'm talking about system modification.
- Speaker #0
Good.
- Speaker #1
So if we have a software-defined automation system, and we need to make the production line produce a different product or comply with new requirements, we need to go from the current state of the automation system to a new state of the automation system. Yes. And that requires changing some code here and there.
- Speaker #0
That's quite interesting. I see the next question: does AI help to identify and generate inputs and outputs based on requirements fed from natural language? And: is the FB Assistant capable of optimizing an existing function block? That's two questions, but I think the second one you just spoke about.
- Speaker #1
Yes, we are not far away from there. I do not fully understand the first part of the question, identify and generate inputs and outputs based on requirements fed from natural language. A state machine, if I understood the question correctly, is implementing the function translating inputs to outputs, right? And based on natural language, we create this state machine.
- Speaker #0
Yeah, yeah. And Thomas is saying: please come up with a series of test cases, run them, and let me know the result. That would be a nice feature. Okay?
- Speaker #1
Yeah. Well, that's one of the reasons why I was keen to give this presentation: we are looking to collaborate with industry, to take real industrial cases, or cases of industrial-grade complexity, and process them using our FB Assistant. So don't hesitate to contact me if you have such cases, and we can demonstrate how they would be processed.
- Speaker #0
So, over to you, Thomas. You have the email of Professor Vyatkin; you have our contacts. We are looking at a part three of these AI services sessions, which could be directly based on that. Please feel free to give us your use cases so we can run them through the Function Block Assistant and see the results. That would be nice. Okay, I think we are good with the questions so far, and I will thank all of today's viewers. We'll have our next session coming up in a few weeks, on part two of AI services, so please stay tuned. It will be quite interesting as well.
- Speaker #1
Yes.
- Speaker #0
And please come up as well with some of your use cases. Please send them to us so we can even do a part three of AI services, just running your own use cases. That would be very nice. Absolutely. Thank you all, thank you, Valeriy, thank you very much, and talk to you soon. Bye bye, have a nice weekend. Yeah, thank you, bye.