Browse Tag: cRIO

Chrome Won't Load LabVIEW Real-Time Web Configuration Pages

I spent a week at the CLA summit last week (more to follow) and got a nasty shock on my return.

On attempting to log in to configure a compactRIO system I get faced with a request to install Silverlight:

Install Silverlight

 

As I have used this every day for a long time, I somehow don’t believe this.

It turns out that Google has removed support for NPAPI (the old Netscape Plugin API), which many plugins, including Silverlight, require to run.

This finally explains why I have been seeing more security warnings about running Silverlight pages. It appears this has been coming for some time, in fact over a year! Wish someone had told me.

Fear not, there is a workaround, but it will only work until September.

  1. Navigate to chrome://flags/#enable-npapi in the browser
  2. Click Enable under Enable NPAPI
  3. A box will appear at the bottom of the screen with a Relaunch Now button. Press this to relaunch Chrome, ensuring you back up any work in the browser first
  4. And back to work fixing compactRIOs.

And in September…

I guess if there is no change from the NI side it’s a move to Firefox or IE!

NI Week 2014 Highlights – New Releases

Obviously a big part of NI Week is getting to see the new releases. Whilst you can get this from the web, what I found useful was attending some of the sessions on the new products; many of the R&D teams attend, so there aren't many questions that go unanswered!

Here are some of the new highlights I saw this year…

LabVIEW 2014

In my previous post I spoke of evolution not revolution. On that theme, the LabVIEW 2014 release was a remarkably understated event at NI Week with few headline new features (though it was great to see the community highlighted again with John Bergmans showing his Labsocket LabVIEW add-on in the keynote).

Having had the chance to review the release notes, though, there are a few features that could be of benefit.

  • You can now select an input to the case structure and make that the case selector. This productivity gain will definitely build up, even if it's only 20 seconds at a time.
  • New window icons to show the version and bitness of LabVIEW. A minor update but useful for those of us using multiple versions.
  • 64-bit support for Mac and Linux. I think the slow rollout of 64-bit LabVIEW is almost certainly hampering its image as a data processing platform in many fields, and this seems like a great commitment to moving it forward.

The others seem like changes you will find as you work in 2014 so let me know in the comments what you like.

What is great is having more stuff rolled up into base packages. I strongly believe there is a software engineering revolution needed in LabVIEW to bring it to the next level so putting these tools into the hands of more users is always good.

LabVIEW Professional now includes Database Connectivity, Desktop Execution Trace Toolkit, Report Generation, Unit Testing and VI Analyzer. LabVIEW FPGA also includes the cloud compile service, which gives faster compiles than ever with the latest updates, or the compile farm toolkit if you want to keep your data on site.

VI Package Manager

One evening I was lucky enough to attend a happy hour hosted by JKI, who, among other achievements, created VI Package Manager, which is by far the easiest way of sharing LabVIEW libraries.

They announced a beta release of VIPM in the browser. This allows you to search, browse and install packages in your browser, promising faster performance than doing the same in the standard application. The bit I think will also be hugely beneficial is bringing in the ability to star your favorite packages. I’m very excited about this as I hope it will make it easier to discover great packages rather than just finding those you are already aware of.

You can browse the public repositories and find popular packages
Each package has its own page and can be installed from here (launches the desktop app)

This is live now at vipm.jki.net. Don't forget to leave any feedback on their ideas exchange; feedback makes moving things like this forward so much easier.

The CompactRIO Revolution Continues!

Two years ago compactRIOs were fun as a developer, not so much if you were new to LabVIEW. They were powerful in the right hands but seriously limited on resources compared to a desktop PC.

A few years ago the Intel i7 version was released, which offered huge increases in CPU performance but was big; embedded was a hard word to apply! (That's not to say it wasn't appreciated.)

Last year the first Linux RT-based cRIO was released, based on the Xilinx Zynq chip. This year it feels like cRIO has made a giant leap forward with the new range.

When you see some of the specs jump like this you can see why as a cRIO geek I am very excited!

cRIO-9025 + cRIO-9118 (top spec of the previous rugged generation) vs the cRIO-9033 (the new top dog):

  • CPU: 800 MHz PowerPC → 1.3 GHz dual-core Intel Atom
  • CPU usage on control benchmark: 64.1% → 10.9%
  • RAM: 512 MB → 2 GB (+300%)
  • FPGA multipliers: 64 → 600 (+837%!!)
  • FPGA LUTs: 69,120 → 162,240 (+135%)

These new controllers are no incremental upgrade; they are a leap forward. My only concern is that it will now be easier to make applications fit, which is/was a bit of a specialty of mine! The new generation of FPGAs really drives part of this; the same difference is seen on the R Series and FlexRIO ranges as well.

There is also a removable SD card slot, additional built-in I/O and the headline grabber: support for an embedded HMI.

At the session on this we got to see it a bit closer. The good news is that it is using the standard Linux graphics support. This means it should support standard monitors and input devices rather than needing any specialist hardware.

Obviously it is going to have some impact on performance. In the benchmark I linked earlier they suggest you could see a 10% increase in CPU usage. I'm looking forward to trying this out; you could easily see a 50% increase on the old generation just by having graphs on a remote panel, so for many applications this seems acceptable.

There is also a KB detailing how to disable the built-in GPU. This suggests that there is extra jitter, which will become significant at loop rates of >5 kHz, so just keep an eye out for that.

Anyway, that got a little serious. I will be back with a final NI Week highlight later in the week, but for now I leave you with the cRIO team.

What to Expect for the CLED Exam

Last week I got my student head on again to take the Certified LabVIEW Embedded Systems Developer (CLED) exams and want to let you know what to expect if you are thinking about taking the exam, or if you are wondering whether it is for you.

The What and the Why

NI introduced the CLED certification last year as a pilot exam in the US, and it is now available in Europe (and possibly other regions). It's a two-part exam to test your knowledge of using RT and FPGA technologies, with a massive emphasis on compactRIO embedded applications.

As a self-proclaimed LabVIEW embedded developer I am very happy this has been introduced (as I can remove the “self-proclaimed”!).

There are a huge number of developers using LabVIEW under Windows for test and measurement applications who are very skilled and may well be CLD or CLA, but this doesn’t mean too much when it comes to RT and FPGA. If you don’t understand the differences and subtleties of these then you can get yourself in a mess.

The CLED allows embedded LabVIEW developers to differentiate themselves and differentiation means your next pay rise/promotion/job.

So what can you expect?

The Pre-Requisites

Before you even open the prep kit, make sure you have the pre-requisites, which seem to be more involved for this exam than for many. Don't be frustrated though: as I will explain, if you don't have these pre-requisites you may well waste your time going through the process only to find you aren't ready yet.

From the NI site:

Prerequisites
– Valid Certified LabVIEW Developer (CLD) or Certified LabVIEW Architect (CLA) certification
– Completion of LabVIEW Real-Time 2: Architecting Embedded Systems or RIO Integrator’s Training course
– Completion of the LabVIEW FPGA training course 

Recommended Experience Level
– 18 to 24 months of experience in developing medium- to large-scale LabVIEW control and monitoring applications with NI CompactRIO, NI Single-Board RIO, and/or NI R Series hardware 

If you don’t meet these then there is your first task!

Part 1: Multiple Choice

Your first test is a 1hr multiple choice exam à la CLAD. You must pass this with a 70% result before attempting part 2.

It tests the theory aspects of developing for RT and FPGA on compactRIO, all the way from deciding how to configure the hardware to how loop priorities work. I did find it a little more conceptual than the CLAD, with less "what is this function" and fewer triple negatives to decode in the answers.

Because of my time at NI I didn't find this element too stressful; having taught the courses for a few years, you get familiar with the nuances of these things. There are, however, some resources I found very useful:

  • CLED Practice Exam: It goes without saying that the sample exam is useful. Also there is a link to a video of a prep presentation at NI Week last year.
  • cRIO Developer's Guide: I would suggest this is probably the bible for part 1. I think that probably every question could be answered in here (though I haven't checked).

I think the key challenge here is that you must make sure you have broad knowledge of cRIO. For example, there are questions in the sample exam on selecting whether to run FPGA mode or scan mode (or hybrid), which I expect many developers may not have come across too often. There are also questions on more niche topics such as FPGA optimisation and advanced shared variable settings, and there will certainly be questions around functions that cause or prevent dynamic memory allocation.
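
If the memory allocation point is new to you: the standard RT guidance is to pre-allocate your arrays once, outside the time-critical loop (Initialize Array then Replace Array Subset), rather than growing them inside it (Build Array), because an allocation mid-loop is a classic source of jitter. As a rough illustration of the same idea in C (my own sketch with made-up sizes, not exam material, and obviously not LabVIEW):

    #include <stdlib.h>
    #include <string.h>

    #define SAMPLES_PER_CYCLE 1024   /* illustrative buffer size */
    #define CYCLES            10000

    /* Jitter-prone: a fresh allocation inside every iteration of the loop. */
    void control_loop_bad(void)
    {
        for (int i = 0; i < CYCLES; i++) {
            double *buf = malloc(SAMPLES_PER_CYCLE * sizeof *buf);  /* allocation every cycle */
            /* ... acquire and process samples in buf ... */
            free(buf);
        }
    }

    /* Deterministic: allocate once up front and reuse the buffer in place. */
    void control_loop_good(void)
    {
        double *buf = malloc(SAMPLES_PER_CYCLE * sizeof *buf);      /* one-time allocation */
        if (buf == NULL)
            return;

        for (int i = 0; i < CYCLES; i++) {
            memset(buf, 0, SAMPLES_PER_CYCLE * sizeof *buf);        /* reuse, no new allocation */
            /* ... acquire and process samples in buf ... */
        }

        free(buf);
    }

    int main(void)
    {
        control_loop_bad();
        control_loop_good();
        return 0;
    }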

Part 2: Practical Exam

Yep, another 4-hour practical exam, but somehow it is different again!

I must admit I was a little dismissive of this initially. Having done the CLD and CLA, as well as training many people on how to take the CLD exam through CLD prep days, I was fairly confident I knew how to tackle these exams. This lasted up until about an hour into my practice exam, when I realised the emphasis changes again.

The practical exam is an application development problem which you deploy onto a Single-Board RIO connected to a simulator. You must write the host code (front panel provided), RT code and FPGA code according to the requirements provided; these are more similar in scope to the CLD than to the heavier CLA requirements. The clue to the key difference is the marking scheme:

  • 50% Functionality
  • 30% Design
  • 15% Style
  • 5% Documentation

Compared to the other exams, functionality is massively more important: you could have rubbish style and documentation (although I hope you wouldn't at this stage) and still get 80% on the exam.

My Experience

This was my first error: in my practice exam I was working to get my style to the level I would for the CLD exam, but there simply isn't time.

In fact time is incredibly tight; I think with another 30-60 minutes I might have had time to get the functionality done and tested. From looking at the sample solution as well, the attitude I took towards it was to treat it as a well-designed prototype. What I mean by this is things like:

  • Design is important. That 30% should be the easiest marks to pick up, as even just starting functionality in the right places with the right communication methods should win marks quickly.
  • Style is less important. With 4 hours to create an end-to-end solution, something has to give, and this is it. Don't go back to pre-CLD habits; I still used non-default icons, modularised and commented where appropriate, but there isn't time to design nice libraries and nitpick.
  • Take every requirement shortcut. You can see this in the sample exam solution: if they don't explicitly ask for it, use whatever route makes your life easier. Obviously in real life there is some value in predicting what a user might ask for next and accounting for it, but that will never happen with the exam!

I think there are probably two different approaches to the exam:

The Horizontal Approach

horizontal approach CLED

This would mean completing the FPGA functionality, then building the RT on top of that and then the host.

There is some advantage in that you don't have to switch context too often (which can be inefficient). You can also get your FPGA compile out of the way, although I found mine took <10 minutes anyway.

It also means you hit some of the key aspects earlier. I think every exam is probably going to need a watchdog, maybe safe states and recovery ability, and this is all lower-level functionality which you can get out of the way first.

My concern with this approach was that it could depend on how it is marked. Technically many features need functionality at every level, which may make this riskier on time. You could probably mitigate this by laying out your design first (and getting those design points) so that, even if you don't get the UI bit done, it is clear how you would.

The Vertical Approach

vertical approach CLED

This is what I opted for. I took each section of the requirements document and implemented the full requirement from top to bottom before moving on to the next section.

This has the advantage of being able to fully tick off requirements earlier; it is easier to test that it is working one step at a time, and I just find it a way of coding that builds confidence as you move forward. But don't think this worked flawlessly!

The first problem is those design points. I laid out much of the design at the start anyway so that those points were done; this also helps move this approach forward, so you're not having to make higher-level design decisions as you go. I do wish I had jotted these down so that I could put the paper in the envelope as well to show the design (presuming these get looked at).

The main issue with this is deciding how to prioritise the features. I had a success with this and a failure.

The success was one feature that was very easy to implement and fitted my design well. Because of this I set it at a low priority and never implemented it. The reason I am happy with this is that I hope the design was obvious: I documented the changes needed for it and freed up my time for other tasks where this would have been harder. Hopefully I still got some marks for my intentions though.

The failure was some of the core features, such as my watchdog. There was one chunk of the application that I worked on start to finish (which took 1-1.5 hours, so a big piece). This didn't leave me much time when I moved on to the fault handling, including the watchdog. I got it in, but only just and not well tested. I should have split the larger task, again documenting where advanced features would go, and moved on to the more core watchdog functionality.
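
For anyone who hasn't built one, the watchdog pattern itself is simple: the FPGA (or a hardware timer) counts down continuously and forces the outputs to a safe state if the RT application stops resetting it. Here is a very rough C sketch of that behaviour, using two threads to stand in for the RT and FPGA loops; the names and timings are invented for illustration and this is certainly not the exam solution:

    #include <pthread.h>
    #include <stdatomic.h>
    #include <stdbool.h>
    #include <stdio.h>
    #include <unistd.h>

    #define PET_PERIOD_US   10000   /* RT loop resets the watchdog every 10 ms   */
    #define CHECK_PERIOD_US 10000   /* watchdog loop checks every 10 ms          */
    #define TIMEOUT_TICKS   5       /* trip to safe state after ~5 missed resets */

    static atomic_int  countdown  = TIMEOUT_TICKS;
    static atomic_bool safe_state = false;

    /* Stand-in for the RT control loop: do the work, then "pet" the watchdog. */
    static void *rt_loop(void *arg)
    {
        (void)arg;
        for (int i = 0; i < 200; i++) {
            /* ... acquisition, control, logging ... */
            atomic_store(&countdown, TIMEOUT_TICKS);   /* reset the countdown */
            usleep(PET_PERIOD_US);
        }
        return NULL;   /* RT app "dies" here, so the resets stop */
    }

    /* Stand-in for the FPGA watchdog loop: decrement and trip the safe state. */
    static void *watchdog_loop(void *arg)
    {
        (void)arg;
        while (!atomic_load(&safe_state)) {
            if (atomic_fetch_sub(&countdown, 1) <= 1) {
                atomic_store(&safe_state, true);       /* drive outputs to safe values */
                puts("Watchdog expired: outputs forced to safe state");
            }
            usleep(CHECK_PERIOD_US);
        }
        return NULL;
    }

    int main(void)
    {
        pthread_t rt, wd;
        pthread_create(&wd, NULL, watchdog_loop, NULL);
        pthread_create(&rt, NULL, rt_loop, NULL);
        pthread_join(rt, NULL);
        pthread_join(wd, NULL);
        return 0;
    }

The important property is that the countdown lives somewhere that keeps running even if the RT side hangs, which on cRIO is exactly what the FPGA gives you.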

Did I Pass?

I don't know. Obviously I passed part 1 to attempt part 2; now I'm in the waiting game for the results. If I had to guess, I would put my chances of passing at about 75%, but let's see. I was much happier with the real exam, but I definitely ran out of time, which will affect my functionality marks, and I could have done with making the design decisions more obvious with documentation. We will see!

If you're attempting the exam, good luck and I hope this was useful. Once you have tried it or got your results, please comment below on how you did and any more tips you may have; it would be great to hear what approaches other people took.

 

