
What To Do About GDevCon

Hello Everyone,

As you can imagine, there is a lot of uncertainty in the GDevCon team right now about how we handle GDevCon and COVID-19. We have been trying to figure out the right path to take. One thing I think we could have done better is communicating the options and the approach we’re taking, so you can plan and try to understand what the future holds.

So this post is an attempt just to reset that a little bit, help you understand where we’re coming from, and also solicit any feedback, because we’re making assumptions about what is most useful to you as well. 

A couple of caveats upfront:

  • This post is my interpretation of the discussions in the GDevCon team (with their blessing). My conjecture may not match everyone else’s on the team.
  • No-one has a crystal ball so please don’t make hard plans based on this post!

Responsible Timing

So the first thing to say is that we didn’t want to rush any decision. In March the severity of the situation became very clear. With the event in September, six months away, we didn’t want to try to predict the future. We want GDevCon to go ahead if it can. We think it’s a very valuable event and, quite frankly, if it is able to go ahead it will be a nice relief after these lockdown times.

So we made a decision early on to hold off on a final decision until June. I don’t think people are making many travel plans anyway, so I expect this delay will have minimal impact on attendees, but it gives us a big benefit: we get to see what is happening in the world.

The Options

So what are the options we are deciding between?

  1. GDevCon #3 going ahead.
  2. Postponement (probably to early 2021)
  3. Online Event

Going Ahead

To go ahead though I think we will need to be confident that three conditions are in place by September:

  1. International travel needs to be easy. GDevCon is a global event with attendees, sponsors and speakers coming from all over the globe.
  2. CERN needs to be happy that they can host us in a way that everyone is safe.
  3. Companies need to be happy to send their staff in the confidence that everybody will be as safe as possible. There may be a situation where travel is possible but companies are still not allowing it. This is a harder one to define but worth discussing.

As I write these down, to be perfectly honest, I find it hard to see that we will hit one of these points, let alone all three. As countries begin to ease restrictions over the next month though this will probably become clearer.

Postponement

Postponement is probably the most likely option. We would look to postpone the event, perhaps to early 2021. 

We have had a great response from sponsors for GDevCon #3, and the presentation submissions have been fantastic. That represents effort from us, from presenters and from sponsors, so we would like to avoid throwing it away and instead take the same speakers, sponsors and attendees with us to a later date.

No-one would be obliged though, and full refunds would be available in this case.

The hard part of this decision is judging when it will be safe.

Online Event

We have had some basic discussions about an online event and are in broad agreement.

To be perfectly honest, we’re not keen. Since we started GDevCon we have seen the team-building and community-building aspects as being as important as the content. That’s why it was a two-day event from the start: we wanted that evening event so people could mix, talk and continue conversations. So we feel we would prefer to avoid an online event and focus on getting an in-person event going.

 

The Future

The good news is that we have been very aware of risk throughout this whole process, and GDevCon 2020 is no exception. We are in a position where we can cancel the event, refund everyone, and still be around and just as strong for 2021.

Feedback

I hope this helps explain where we are and the options we are considering. 

This event is about you though so tell us what you think. Does making a decision earlier help you significantly? Has your company/spouse already ruled out travel in September anyway?

Understanding your position will help us to understand the options in front of us. So either comment below or get in touch with the team via email or social media.

e: admin@gdevcon.com
t: https://twitter.com/GDevConference
l: https://www.linkedin.com/company/gdevcon/

Reflecting on GDevCon #2

GDevCon #2 is wrapped. I’m really proud of the event we put on this year and it’s got me really excited about where we can take it next year. We’ve got plenty to think about and I hope that we can top it again.


I’ve taken some really interesting thoughts away and I wanted to share them (having to describe them helps me as well!)


Practical Talks


Our target from the start has been that this event has to be valuable to all members of a team. The CLA Summits are often a place to pontificate about architecture and design, but that is of little interest to someone just getting going with LabVIEW, or working on an established framework with engineering challenges and deadlines to meet.


This year I feel we did this much better, and I believe that showed in the variety of levels, the engaging presentations and the useful takeaways.


Pragmatism


On the whole, it felt like there was a more pragmatic approach than at other conferences (influenced by DSH Workshops’ pragmatic software development workshop, perhaps?).

This came from a combination of confident presenters, who were happy to present what they do and accept it may not be perfect but it works for them, and comments and questions that largely respected this (while still pointing out other options where they exist).


This really excites me, as I think it makes for a friendlier and more open environment, and hopefully it will encourage more people to speak so we get different perspectives on topics.


So think about this in future presentations. It works for you! Then try to think about the pros, the cons and your context, which all factor into it being a good engineering decision.

“Chill Out” Room

I still love having the streaming room. I get frustrated sitting in one place for too long so it was great to have somewhere to stretch out but not miss the content (as I often have at other conferences). It does cost money! But it seems to be valued by those who use it.


Ready for Next Year!

I hope we can start planning next year soon, so make sure you’re signed up to the mailing list at gdevcon.com to hear about it.

We will be asking for feedback soon as well so do let us know what works and what doesn’t. My final takeaway is that I can be very wrong about what people will like! I’m happy to admit that I had concerns about a few things this year and how they would be received that were completely wrong. So help me to be right more by telling us what you need!

European CLA Summit 2016

Wow.

I am sat in a hotel lobby after the European CLA Summit, blown away by the amount of talent coming through in the LabVIEW community.

The CLA summit is an event designed for LabVIEW Architects to come together and share concepts and ideas to continue learning after the end of the NI courses.

It was a huge event this year, with (I believe) around 140 attendees and a great proportion of new faces.

Some of my personal favourites:

  • James Powell presented the changes he would make to the standard QMH template to make it less likely to hit bugs, by asking “What would a subVI do?”. An earlier form of this discussion changed the way I view this pattern and I think that everyone should see it!
  • The G Code Manager – A simple plugin tool that can greatly speed up managing some of the properties of VIs and classes. This is now online here and I am definitely going to be trying it out.
  • LabVIEW Channels – Jeff Kodosky presented these, and the more I learn about them the more intrigued I get, especially when Stephen Loftus-Mercer illustrated a potential new design pattern using them.
  • LabVIEW Containers from Chris Cilino – Similarly, I have seen bits of these a few times, but something resonated a lot more this time. I saw more clearly where they could save me time and I will be looking to try them out.

But I have to say there was not one bad presentation. They all gave me ideas, from looking at continuous integration in LabVIEW again to thinking about our role as a community and what we need to do to bring it forward.

Naturally, I couldn’t help but get up myself. I presented my unit testing methodology, which I have previously written about, and gave a short presentation about a command line tool I’m working on (expect to see something here on that soon!)

Finally, I find some of these things are best summarised by twitter!

 

 

NI Week 2014 Highlights – All Y’All

Part 3. All Y’All

It is so great to meet up with a community of like-minded engineers. Many of you I had not seen for a couple of years, and I met many new people (as well as putting faces to avatars).

Talking of community, thanks to Mark Balla you can download videos of many of the sessions. Thanks again Mark, these are a great asset for the many people who can’t attend events. Remind me, and everyone else, to buy you a beer the next time the opportunity arises.

Also, Fabiola De La Cueva of Delacor recorded her excellent session on unit testing; if you’re interested in the concept I recommend a watch.

I don’t get on Twitter very often, but I find it great for events like NI Week, so I will leave you with some of my favourites; consider it a #ff post!

Hope to see you all at another NI Week/NI Days/CLA Summit soon!

NI Week 2014 Highlights – New Releases

Obviously a big part of NI Week is getting to see the new releases. Whilst you can get this from the web, what I found useful was attending some of the sessions on the new products: many of the R&D teams attend, so there aren’t many questions that go unanswered!

Here are some of the new highlights I saw this year…

LabVIEW 2014

In my previous post I spoke of evolution, not revolution. On that theme, the LabVIEW 2014 release was a remarkably understated event at NI Week, with few headline new features (though it was great to see the community highlighted again, with John Bergmans showing his Labsocket LabVIEW add-on in the keynote).

Having had the chance to review the release notes, though, there are a few changes that could be of benefit.

  • You can now select an input to a case structure and make it the case selector. This productivity gain will definitely build up, even if it’s only 20 seconds at a time.
  • New window icons to show the version and bitness of LabVIEW. A minor update but useful for those of us using multiple versions.
  • 64-bit support for Mac and Linux. I think the slow rollout of 64-bit LabVIEW is almost certainly hampering its image as a data processing platform in many fields, and this seems like a great commitment to moving it forward.

The others seem like changes you will find as you work in 2014 so let me know in the comments what you like.

What is great is having more stuff rolled up into base packages. I strongly believe there is a software engineering revolution needed in LabVIEW to bring it to the next level so putting these tools into the hands of more users is always good.

LabVIEW Professional now includes Database Connectivity, the Desktop Execution Trace Toolkit, Report Generation, Unit Testing and VI Analyzer. LabVIEW FPGA now includes the cloud compile service, which gives faster compiles than ever with the latest updates, or the compile farm toolkit if you want to keep your data on site.

VI Package Manager

One evening I was lucky enough to attend a happy hour hosted by JKI who, among other achievements, created VI Package Manager, by far the easiest way of sharing LabVIEW libraries.

They announced a beta release of VIPM in the browser. This allows you to search, browse and install packages in your browser, promising faster performance than doing the same in the standard application. The bit I think will also be hugely beneficial is bringing in the ability to star your favorite packages. I’m very excited about this as I hope it will make it easier to discover great packages rather than just finding those you are already aware of.

[Screenshot: You can browse the public repositories and find popular packages]

[Screenshot: Each package has its own page and can be installed from here (launches the desktop app)]

This is live now at vipm.jki.net. Don’t forget to leave any feedback on their ideas exchange; feedback makes moving things like this forward so much easier.

The CompactRIO Revolution Continues!

Two years ago CompactRIOs were fun if you were an experienced developer, not so much if you were new to LabVIEW. They were powerful in the right hands but seriously limited on resources compared to a desktop PC.

A few years ago the Intel i7 version was released, which offered huge increases in CPU performance but was big; “embedded” was a hard word to apply! (That’s not to say it wasn’t appreciated.)

Last year the first Linux RT based cRIO was released, based on the Xilinx Zynq chip. This year it feels like cRIO has made a giant leap forward with the new range.

When you see some of the specs jump like this you can see why as a cRIO geek I am very excited!

| | cRIO-9025 + cRIO-9118 (Top spec of previous rugged generation) | cRIO-9033 (New Top Dog) | Change |
| --- | --- | --- | --- |
| CPU | 800MHz PowerPC | 1.3GHz Dual Core Intel Atom | |
| CPU Usage on Control Benchmark | 64.1% | 10.9% | |
| RAM | 512MB | 2GB | +300% |
| FPGA Multipliers | 64 | 600 | +837%!! |
| FPGA LUTs | 69,120 | 162,240 | +135% |

These new controllers are no incremental upgrade; they are a leap forward. My only concern is that it will now be easier to make applications fit, which is (or was) a bit of a specialty of mine! The new generation of FPGAs drives a large part of this; the same difference is seen on the R Series and FlexRIO ranges as well.

There is also a removable SD card slot, additional built-in I/O and the headline grabber: support for an embedded HMI.

At the session on this we got to see it a bit closer. The good news is that it uses the standard Linux graphics support. This means it should support standard monitors and input devices rather than needing any specialist hardware.

Obviously it is going to have some impact on performance. In the benchmark I linked earlier they suggest you could see a 10% increase in CPU usage. I’m looking forward to trying this out; you could easily see a 50% increase on the old generation just by having graphs on a remote panel, so for many applications this seems acceptable.

There is also a KB article detailing how to disable the built-in GPU. It suggests that there is extra jitter which becomes significant at loop rates of >5 kHz, so just keep an eye out for that.

Anyway, that got a little serious. I will be back with a final NI Week highlight later in the week, but for now I leave you with the cRIO team:

NI Week 2014 Highlights – Buzzwords Galore

NI Week 2014 is unfortunately over (although it means I do get to return to temperatures I seem to be better built for!). I wanted to share some of my highlights, which will hopefully get you as excited as I am and, who knows, even persuade you to come next year! As I started writing this got longer and longer, so for now here is part 1:

1. Buzzwords Galore

This year was certainly prime for buzzword bingo with “Internet of Things” and “Big Data” flying around.

The thing that frustrates me about these terms is the image they create of some magic black art, where you must pay thousands to get into the club and understand it.

The reality is we don’t wake up one morning and build an internet of things. It is a constant evolution of current technology towards blue sky thinking. As Jim Robinson from Intel said in the Wednesday Keynote,

[The internet of things] is the overnight sensation that’s been 30 years in the making.

The great thing about NI Week is that many of the people making those steps are around and it really makes you feel like progress is happening.

For me it was particularly exciting to have a customer of mine showcasing their work in these areas.

National Grid are working to connect 135 power quality monitors to substations across the UK, built with CompactRIO, with the goal of collating this data to ensure the stability of the power grid. Once processed, this will amount to capturing more than 11 billion measurements per year from across the UK, and power engineers will be able to connect to the monitors live to keep an eye on grid conditions.

You can see more by watching the video from the keynote.  (Wednesday – The Internet of Things for Jim Robinson and Wednesday – SmartGrid for National Grid)

As for big data, I found it somewhat demystified by a great talk from an external speaker from Dell Software. Unfortunately I failed to take down his name, and I’m pretty certain it isn’t who is listed (if so, you need to update your LinkedIn profile picture!). I took away a few interesting points:

  • Big data is really all about analytics (which by the way has been done for years!).
  • He chose to define “extreme data” as when this processing cannot be done on data at rest in the database. Rather it must be done as the data is captured.
  • There are multiple stages to these analytics, from simply dumping data to a database for mining, through more advanced structuring and deriving management dashboards, up to neural networks and advanced analytics for decision making. Each step reduces the data and provides more insight.

Next year I have learnt I must take more pictures to make describing sessions easier!

As a result of NI Week 2014, I feel I finally have a better sense of what these terms mean to me, and I am excited that we are all part of this revolution evolution.

For Part 2 I will talk about some of the new products I am excited about…

NI Week 2014!

I am very excited to have made it to Austin, Texas this week for the annual NI Week conference. Hosted by National Instruments, it represents a great opportunity to get together with like-minded LabVIEW developers and users of other NI technologies. It also heralds the release of many new products, including LabVIEW 2014. On top of that, I’m delighted that one of our projects will be on the Wednesday keynote stage.

No doubt I will be posting here over the next week or so about some of the most exciting things to come out of the conference. In the meantime, this is one of the few times I actually pick up Twitter, so feel free to follow @JamesMc86 and/or @WiresmithTech, and if you’re at the conference come and say hello; I’m best described as the hairy one!


Managing Large Data Sets in LabVIEW

Have you ever run out of memory in LabVIEW?

I gave a presentation at the CLD summit last week talking about some of the design considerations and a few ideas for techniques that can help when it comes to high throughput applications. If you click the cog you can also open the speaker notes which elaborate on some of the points.

The problem with long term waveform data storage

This was a key item that I didn’t manage to get to. Currently there are two prevalent techniques I would look at:

Relational Databases

These are your SQL-based databases, whether MySQL, MS SQL Server or similar.

The idea in these is that all of the data is stored in tables. The columns available are fixed in the design of the database and you add data by filling rows. The relational element comes from the fact that each row has a unique identifier which can be referenced in other tables.

[Diagram: Relational model]

The challenge with waveform data is understanding how this translates to a table.

You could store the entire waveform in a single field as a binary blob, but this limits the searchability (which I think is a word!).

Alternatively, you must create a new row for every data point, and each row needs a timestamp, seriously increasing the storage required and reducing the performance of searches. And that is before you get into working out the correct design for optimum performance.
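To make that trade-off concrete, here is a minimal sketch of the two layouts, using Python and SQLite purely for illustration (the table and column names are my own invention, not a recommended schema):

```python
# Sketch only: comparing the two relational layouts discussed above.
import sqlite3
import numpy as np

conn = sqlite3.connect(":memory:")

# Option 1: one row per waveform, samples stored as a binary blob.
# Compact and quick to write, but you cannot search individual samples.
conn.execute("""
    CREATE TABLE waveform_blob (
        id      INTEGER PRIMARY KEY,
        channel TEXT,
        t0      TEXT,   -- start timestamp
        dt      REAL,   -- sample interval in seconds
        samples BLOB    -- raw sample data
    )""")
samples = np.random.rand(10_000)
conn.execute(
    "INSERT INTO waveform_blob (channel, t0, dt, samples) VALUES (?, ?, ?, ?)",
    ("ai0", "2014-08-05T12:00:00", 0.001, samples.tobytes()),
)

# Option 2: one row per data point. Fully searchable, but every single
# sample now carries its own timestamp, so storage and query cost balloon.
conn.execute("""
    CREATE TABLE waveform_points (
        channel   TEXT,
        timestamp REAL,  -- seconds since the start of the test
        value     REAL
    )""")
conn.executemany(
    "INSERT INTO waveform_points VALUES (?, ?, ?)",
    [("ai0", i * 0.001, float(v)) for i, v in enumerate(samples)],
)
conn.commit()
```

Even at this toy scale the second table needs 10,000 rows for ten seconds of a single channel, which is exactly the problem.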

Datafinder

Datafinder is National Instruments’ solution to this. It is a file indexer: you store all of the files that you want to make searchable in a common place, which Datafinder indexes.

Through DataPlugins, multiple file types can be supported, but they all get translated to the TDMS-style structure to make the properties searchable. You then use the toolkit for LabVIEW, or DIAdem, to mine through the data.

This has some appealing characteristics. It is ridiculously simple to set up compared to a database: just put your files in the right place. It is also quite flexible, able to take data from different sources and still keep it easily searchable.
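In spirit, the indexing side of this works something like the sketch below. This is only my rough illustration of the concept in Python using the nptdms package (assuming I have its metadata-reading API right), not how Datafinder is actually implemented:

```python
# Toy "index the properties, search them later" example, not Datafinder itself.
from pathlib import Path
from nptdms import TdmsFile

def build_index(data_dir):
    """Scan a folder of TDMS files and collect their file-level properties."""
    index = []
    for path in Path(data_dir).glob("*.tdms"):
        tdms = TdmsFile.read_metadata(str(path))  # metadata only, no bulk data
        entry = dict(tdms.properties)             # e.g. operator, test id, ...
        entry["__file"] = str(path)
        index.append(entry)
    return index

def search(index, **criteria):
    """Return the files whose properties match all of the given key/values."""
    return [
        entry["__file"]
        for entry in index
        if all(entry.get(key) == value for key, value in criteria.items())
    ]

# index = build_index("C:/Data")
# print(search(index, operator="James"))
```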

The main issue is that it is file based: if the data is continuous across files and the section you’re interested in spans files, you have to code around this to load from multiple files.

However, I am wondering whether there may be a new kid on the block that could overcome some of these issues:

NoSQL and MongoDB

So I’m pretty sure every credible writer starts with a wikipedia quote:

NoSQL or Not Only SQL database provides a mechanism for storage and retrieval of data that is modelled in means other than the tabular relations used in relational databases. Motivations for this approach include simplicity of design, horizontal scaling and finer control over availability.

NoSQL – Wikipedia

In short NoSQL is one of those wonderful buzzwords which doesn’t mean anything specific, just different!

I was quite intrigued though by some of the different data models. The one that stands out is the document model used by MongoDB among others.

This means that instead of defining tables, you add data as documents. These documents contain fields which can be indexed, but the structure is quite flexible: different documents don’t have to match exactly. At the very least this maps onto the structure of Datafinder very nicely and could be a viable alternative where the file-based management is unappealing.
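As a rough illustration of what that looks like in practice, here is a short pymongo sketch. The collection and field names are entirely my own invention for the example rather than any established schema for waveform data:

```python
# Sketch of the document model for test data using pymongo.
from datetime import datetime
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")
waveforms = client["test_data"]["waveforms"]

# Documents don't have to share a schema: this one carries a notes field...
waveforms.insert_one({
    "channel": "ai0",
    "start_time": datetime(2014, 8, 5, 12, 0, 0),
    "end_time": datetime(2014, 8, 5, 12, 0, 10),
    "dt": 0.001,
    "samples": [0.1, 0.2, 0.3],          # a chunk of the waveform
    "notes": "first run after calibration",
})
# ...and this one doesn't, yet both remain searchable.
waveforms.insert_one({
    "channel": "ai1",
    "start_time": datetime(2014, 8, 5, 12, 0, 10),
    "end_time": datetime(2014, 8, 5, 12, 0, 20),
    "dt": 0.001,
    "samples": [0.4, 0.5],
})

# Index the fields you want to search on, much like Datafinder properties.
waveforms.create_index([("channel", ASCENDING), ("start_time", ASCENDING)])
```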

My next step, though, is to investigate the capabilities for spreading data across documents. Most databases allow you to define database-side functions for how to retrieve data; these are typically high performance and can be used from any language that has a database driver. I’m planning to investigate whether this will allow for a structure that can retrieve continuous data that spans files and make some of our “big data” challenges go away.
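As a first, client-side pass at that idea (before looking at database-side functions), the query might look something like the sketch below. It assumes each chunk document stores start_time and end_time fields, as in the example above; that schema is my assumption, not an established approach:

```python
# Sketch: stitch together a continuous window of data that spans chunk documents.
from datetime import datetime

def load_window(waveforms, channel, t_start, t_end):
    """Return the samples of `channel` whose chunks overlap [t_start, t_end)."""
    cursor = waveforms.find({
        "channel": channel,
        "start_time": {"$lt": t_end},
        "end_time": {"$gt": t_start},
    }).sort("start_time", 1)
    samples = []
    for chunk in cursor:
        samples.extend(chunk["samples"])  # trimming the partial edge chunks is left out
    return samples

# data = load_window(waveforms, "ai0",
#                    datetime(2014, 8, 5, 12, 0, 0),
#                    datetime(2014, 8, 5, 12, 0, 15))
```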

I know @petedunc has made a start on getting LabVIEW to talk to MongoDB. Has anybody out there tried these techniques? I’m open to any pointers. Or are there techniques I’m missing?

UK CLD Summit 2014

For the last two days I have been at the inaugural UK CLD Summit at the NI offices in Newbury (conveniently a 15-minute walk away, so no excuse not to go!).

It was an excellent event where I still met developers I hadn’t met before and caught up with those I had. My big takeaways were:

  1. Working through an MVC framework idea with a group of developers in the developer jam, inspired by multiple presentations on the subject. You can find the results on Bitbucket; they are still rough, but watch this space: this is a project that I want to continue after the event with some community backing.
  2. I’m convinced by the idea of working with open document formats to reduce dependencies, which Steve discusses in his blog post and covered at the Central South User Group.
  3. Monkeys and Teddy Bears will help my business!
  4. I need to get better at talking to people between these events.

What’s more, it has continued to convince me that as a community we achieve so much more together than apart. Across various discussions, the topic came up of how much we build from the ground up when, between us, many people already have all of the pieces. It must be possible to increase productivity in LabVIEW if the community can bridge some of the gaps, or just build on top of the existing offering from NI. I feel some rambling coming on, which I will save for another post!

For now, it is great to get fellow LabVIEW geeks together; I look forward to the next time and I think others enjoyed it as much as I did. If you like the sound of it, I would highly recommend finding your local user group, or starting one if there isn’t one near you. All you need is a few LabVIEW engineers and a room to get started!

I will also post my presentation here in the next few days with some thoughts that I didn’t manage to get to on the day.

