Q&A Series (26)



gongkongedit

Posted: 2002-10-28 14:09:00
OP
I have a question concerning DeviceNet. I need to make a protocol converter to get an existing device connected to DeviceNet. The converter will have only two connections: one connector to the device, for communication and for the device's power supply, and one connector to DeviceNet. DeviceNet will power both the device and the converter. The converter itself will run at any DeviceNet voltage (11-25 V), but the existing device needs a supply above 11 V (e.g. 15 V or 20 V). There are two possible solutions: a) both the existing product and the converter are powered by DeviceNet at the same time, but the customer has to supply a sufficient voltage on the DeviceNet bus (e.g. 22 V minimum); b) the customer has to supply an additional voltage, which makes the converter bigger and more expensive, but the complete system will work in the worst case. Which solution is preferable, or possible at all?

Answered by Matt Kuzel, Chairman of the Physical Layer SIG, e-mail: kuzel@voyager.net

A264) Your situation is somewhat common. Other examples are when people want to turn on 24 V solenoids or lamps. The answer really depends on the specific application, so the choice is really yours. If your customer is OK with restricting power use, then your first choice is fine. There are risks that your customer's requirements could change, or that you could find another customer with different requirements. I am not really sure how big is too big and how expensive is too expensive, but for low-power loads (around 2 W or less), boost regulators can be built for under $10 (material cost, low quantity) and occupy only a few square centimeters. It is important that you limit the *peak* current drawn from DeviceNet. Keep in mind that a boost regulator will also raise the *average* current demand on DeviceNet. Another option is to power your device as you would when DeviceNet is not used.
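The point about average current can be made concrete: a boost regulator roughly conserves power, so the lower the bus voltage, the more current it pulls from DeviceNet. A minimal sketch of that calculation — the 85% efficiency figure is an assumption for illustration, not from the answer:

```python
def bus_current(load_w, v_bus, efficiency=0.85):
    """Approximate average current a boost regulator draws from the
    DeviceNet bus: input power = output power / efficiency,
    then I = P_in / V_bus."""
    return load_w / (efficiency * v_bus)

# A 2 W load at the worst-case 11 V bus draws noticeably more
# average current than the same load at a 24 V bus.
i_worst = bus_current(2.0, 11.0)   # ~0.21 A
i_best = bus_current(2.0, 24.0)    # ~0.10 A
print(f"11 V bus: {i_worst:.3f} A, 24 V bus: {i_best:.3f} A")
```

This is why the answer stresses limiting *peak* current as well: the regulator's input current spikes during switching and at start-up, and the worst case occurs at the minimum bus voltage.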
In chapter 9 "Physical Layer Requirements" (page 9-9) an example circuit is given showing how to accomplish bus power detection for an isolated node. In our situation we have very limited circuit board space, so I'm wondering if we can do without this bus power detection in our isolated node. The explanation on page 9-8 says that "a node must not enter a bus off condition whereby requiring user intervention in the event of a transceiver power failure". Is this possible in an isolated node without bus power detection?

Answered by Matt Kuzel, Chairman of the Physical Layer SIG, e-mail: kuzel@voyager.net

A276) Some other possibilities:
1) Move the isolation to the I/O side of the processor. This way the uP is also powered from the network.
2) Pass the power for the transceiver across the isolation barrier and do not power it from the network.
3) Some devices monitor the network before transmitting to determine the baud rate (autobaud). Although this feature is not yet defined in the spec, it is not prohibited. By their nature these devices avoid the undesired condition. Autobaud is a little tricky because it is important to prevent error frames from being sent when the data rate is wrong. It would also be possible to use switches to set the data rate but simply not transmit until valid traffic is received. Of course you cannot build an entire network out of this kind of device, because then all devices would simply wait.
4) Develop another way to determine whether the transceiver is off.
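The autobaud idea in option 3 can be sketched as a listen-only scan over the three standard DeviceNet data rates, transmitting nothing until a valid frame is observed. `listen_frame` here stands in for a hypothetical CAN driver call (put the controller in listen-only mode at the given rate and report whether a valid frame arrived); it is an assumption for illustration, not a real API:

```python
# Standard DeviceNet data rates, in bit/s.
DEVICENET_RATES = (125_000, 250_000, 500_000)

def detect_baud(listen_frame, max_scans=100):
    """Scan the DeviceNet rates in listen-only mode until a valid
    frame is seen; return that rate, or None if nothing is heard.

    listen_frame(rate) is a hypothetical driver hook: configure the
    CAN controller for `rate` in listen-only (no-ACK, no-error-frame)
    mode, wait briefly, and return True if a valid frame was received.
    """
    for _ in range(max_scans):
        for rate in DEVICENET_RATES:
            if listen_frame(rate):
                return rate
    return None
```

Listen-only mode matters here: it is what prevents the node from emitting error frames while it is sampling at a wrong rate, which is exactly the pitfall the answer warns about. It also shows why a whole network of such devices deadlocks — with no one transmitting, `listen_frame` never returns True for any node.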

