What if we could just *draw* the code?
December 7, 2014 11:39 AM

Over the years, there have been many visual programming languages, in which code is represented as images. Perhaps the most successful, with estimates of over 200,000 licenses, is the engineering and scientific language LabVIEW (an acronym for Laboratory Virtual Instrument Engineering Workbench). Originally released by National Instruments in 1986 for the Macintosh, LabVIEW features dataflow programming; real-time, embedded, FPGA, multi-platform, and even LEGO targets; an LLVM compiler; automated multithreading; and the extensive ecosystem one expects from a nearly 30-year-old language.

One of LabVIEW's most distinguishing features, though, is how much people hate it.

For those curious to see how LabVIEW is used, there are numerous video tutorials, and even more advanced training. There are also large sponsored and independent communities of LabVIEW programmers.

Tangential LabVIEW previously.
posted by underflow (74 comments total) 48 users marked this as a favorite
 
Text is the worst metaphor for computer programs, except for all the others.
posted by BungaDunga at 11:44 AM on December 7, 2014 [33 favorites]


Someone tried to get me to use Authorware for something once, I was not having that AT ALL.

(IIRC it ended up being done in Director)
posted by Artw at 11:50 AM on December 7, 2014 [4 favorites]


In Soviet Russia, DRAKON programs you a Buran!
posted by BungaDunga at 11:52 AM on December 7, 2014 [1 favorite]


My whole lab runs on LabVIEW and I'm the only one who doesn't know how to program in it. I can run VIs but I'm not about to learn how to make my own. Life is too short.
posted by mbd1mbd1 at 12:05 PM on December 7, 2014 [1 favorite]


Are these like DAGs for engineers?
posted by MisantropicPainforest at 12:09 PM on December 7, 2014


The problem with visual representation of code is that it's either grossly simplified (and thus not powerful enough to express difficult concepts) or it's ridiculously over-complicated, in which case you're actually losing clarity by using a visual metaphor. Hunting the little "handle" on an object representing a case statement is not an improvement over just typing the damn code.
posted by sonic meat machine at 12:09 PM on December 7, 2014 [13 favorites]


LabVIEW hate is much-deserved hate. Did I mention I hate it? I do. Its signal processing code can go die in a fire. It makes the chore of compiling Matlab into executable/callable code palatable. I would rather re-code swathes of LabVIEW logic into native Java or C or C++ than suffer through its unpleasantness (will it compile? will it link? will it run without crashing? who can say, it behaves so randomly). Inevitably, someone always has to re-write all of the logic into a better language to make it actually usable in products anyway (performance and stability never seemed to be its strong suits). So glad I don't work on projects that use LabVIEW anymore.
posted by combinatorial explosion at 12:13 PM on December 7, 2014 [2 favorites]


Are these like DAGs for engineers?

Conceptually, pretty close, but they're not necessarily acyclic. Data flows are edges, data transformations are nodes.
posted by underflow at 12:15 PM on December 7, 2014
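The edges-and-nodes framing above can be made concrete with a small sketch. This is my own illustration, not LabVIEW's actual execution model; the node names and the pull-based evaluator are invented for the example.

```python
# Hypothetical dataflow sketch: nodes are pure functions, edges record
# which node outputs feed which node inputs. This particular graph happens
# to be acyclic, which is what makes simple pull-based evaluation work.
nodes = {
    "source": lambda: 3.0,
    "double": lambda x: 2 * x,
    "square": lambda x: x * x,
    "add":    lambda a, b: a + b,
}
edges = {
    "double": ["source"],
    "square": ["source"],
    "add":    ["double", "square"],
}

def evaluate(name, cache=None):
    """Pull a node's value through the graph, memoizing intermediate results."""
    cache = {} if cache is None else cache
    if name not in cache:
        inputs = [evaluate(dep, cache) for dep in edges.get(name, [])]
        cache[name] = nodes[name](*inputs)
    return cache[name]

print(evaluate("add"))  # 2*3.0 + 3.0*3.0 = 15.0
```

Note that "double" and "square" have no ordering relationship with each other; only the data dependencies constrain evaluation order.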


IMHO LabVIEW and NI have been holding back visual dataflow programming for decades by A) being both shitty and expensive (I think it's something like $2000 for a version that allows you to deploy an executable), and B) sitting on dozens of patents on core concepts of visual dataflow programming, which they're happy to enforce at the drop of a hat.
posted by vanar sena at 12:15 PM on December 7, 2014 [11 favorites]


Patents! Driving innovation away!
posted by cthuljew at 12:21 PM on December 7, 2014 [8 favorites]


Text is visual. The real question is: why do you think you can improve on the visual conventions of fixed-width text, which have been developed over several hundred years?
posted by phrontist at 12:23 PM on December 7, 2014 [8 favorites]


I used to do a lot in Authorware in my days of writing educational software. It was often quite limiting but there's parts of it I miss and parts which would be very useful to non-programmers now.

I mostly used Authorware's scripting language rather than relying solely on the built-in interaction icons, but I liked the flowchart metaphor, which could quickly give a visual of how the program would flow from code block to code block. It was also remarkably flexible at times. I used Authorware to create a simple IDE which was used by the educators on our team for templated content, and the output was a separate Authorware file.

It was also great for teachers. The basics could be learned in an afternoon and then the teacher could generate her own simple CBTs and quizzes.

I think Authorware could still have a valuable place in Education if it could generate HTML5 apps, as well as Android / iOS standalones. There seems to be a dearth of easy-to-use visual development tools these days.
posted by honestcoyote at 12:25 PM on December 7, 2014 [2 favorites]


Flowcharts preceded "general purpose" programming languages, first being used in the planning of industrial operations and later to convey an algorithm to those tasked with producing the instructions required to run the calculations on a specific machine. See this paper by Goldstine and von Neumann from 1947.

So one way of posing the question is not: why haven't we found the right flowchart language, but why have flowcharts been perennially rejected since the very dawn of general purpose computers?
posted by phrontist at 12:32 PM on December 7, 2014 [6 favorites]


phrontist: "Text is visual. The real question is: why do you think you can improve on the visual conventions of fixed-width text, which have been developed over several hundred years?"

I think it's more or less self-evident that graphical presentation of abstract structures and data can be a useful tool for understanding and learning. Whether it's useful as a primary mechanism to actually create programs is still up for debate, and I'm happy not to make any predictions on either side yet. If there are clever people out there who are willing to experiment with these ideas, more power to them.

What I am fairly certain about is that some things that we currently call "programming" will cease to be called that as they get subsumed by specialized tools. Just like MAX/MSP allows wiring things together, or spreadsheets allow you to work with grids of numbers, or workflow tools allow you to model and implement processes, this kind of thing has been happening for a while already.
posted by vanar sena at 12:35 PM on December 7, 2014 [1 favorite]


vanar sena: [S]itting on dozens of patents on core concepts of visual dataflow programming, which they're happy to enforce at the drop of a hat.
One of the biggest disappointments in my career is that we haven't gotten the genuinely useful CASE tools that they've been promising us since I was a teenager. If NI's patents are even part of the reason, that makes me even madder at them than I am at Umang Gupta.
posted by ob1quixote at 12:39 PM on December 7, 2014 [3 favorites]


My only experience with LabView is doing LEGO Mindstorm stuff with middle school students. I *hate* it.

Fundamental flow control is only very vaguely indicated (by which I mean it's often entirely unclear exactly when in a loop a variable is accessed or modified.) The nice thing about LabView, in an educational context, is that a middle school robotics classroom is literally a room full of monkeys banging on keyboards and randomly clicking, and LabView can translate those user interactions into code which does something. But, when you get the rare primate that is really motivated to make something very particular happen with a robot, debugging student diagrams/code is really frustrating for everyone.

The funny thing is that, in my experience, young kids actually don't have great spatial reasoning skills. The linear progression of say

10 PRINT 'FUCK'
20 GOTO 10

totally makes sense to kids in a way that a rat's nest of wires and magic boxes doesn't. The problem is that young kids can't type (although middle school kids certainly can). At least as of a couple of years ago, the ROM hacks for Mindstorms that let you program in 'C'-like code worked pretty well... but the IT work to get that running is a bit above the typical middle school student.
posted by ennui.bz at 12:39 PM on December 7, 2014 [5 favorites]


I've not used LabVIEW. I've used/seen a few other visual-programming tools. The two I will call out are both IBM products - one called IBM Integration Bus and one called Node-RED. Well, Node-RED isn't strictly an IBM product.

In looking at some of the videos that talked about LabVIEW, the visual paradigm it uses appears to be much closer to what I've seen used in Tibco software integration development.

One interesting distinction between what I've seen with Tibco - and the bit of LabVIEW - and Integration Bus is that Integration Bus lets you reformat the diagrams, in the same way that a smart text editor for Java and the like will adjust whitespace for you. Integration Bus will let you lay out the flow of the connected graphs from left to right, or top to bottom, and so on, and it will organize the whole content that way.

It seems like LabVIEW in particular could use a function like that, to help resolve the spaghetti layout problems people are hating on.
posted by jefflowrey at 12:40 PM on December 7, 2014


I think it comes down to different thinking processes.

My old lab ran a microscope image capture software suite called Openlab. Very powerful kit. Came with a module called Automator, I think, that let non-programmer biologists automate the functions of a microscope by building a visual workflow with loops and conditions. They loved it, particularly for deconvolution imaging, where a lot of out-of-focus images get processed into one image with greater resolution than what is normally allowed with visible light frequencies, which would be difficult to do by hand, reliably and consistently. I don't think they would have been as comfortable and productive with a shell script.

Anyway, the point is that packages like LabVIEW are for a different (scientific, non-programmer) audience than IT.
posted by a lungful of dragon at 12:40 PM on December 7, 2014 [1 favorite]


Anyway, the point is that packages like LabVIEW are for a different (scientific, non-programmer) audience than IT.

I think the thing that causes so much specific hate for LabView is that the metaphor it implements for the user-to-program interface gets in the way of the program-to-computer interface. I don't currently do a lot of LabView programming, but for about a year and a half I was my lab's primary LabView person, and the student to whom that mantle (*cough* burden) has passed will consult with me sometimes because no one else in the lab has experience with it. About 90% of the time, the thing we're trying to troubleshoot seems absurdly simple from a computing point of view, like "tell loop A to wait for state B before executing command C to read variable D", but LabView will do something like read variable D before state B has been reached, or refuse to read state B because state B executes on a new NI control board that has a completely different read procedure to everything else in the program. Basically, the level of abstraction makes the learning curve easier at the outset (neither I nor my labmate has any formal programming experience, or, for that matter, any formal LabView experience, beyond "learn what you need to fix the thing", because that's what grad students do), but the abstraction extends to obfuscate what should be basic tasks for someone proficient in the basic rules of the language.
posted by kagredon at 12:57 PM on December 7, 2014 [2 favorites]
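The "wait for state B before executing command C to read variable D" pattern above is, in a text-based language, an explicit synchronization primitive. Here is a hedged Python sketch of that coordination; the names `state_b`, `shared`, and `loop_a` are invented for illustration.

```python
import threading

state_b = threading.Event()     # flag for "state B has been reached"
shared = {"d": None}            # variable D, written by another part of the program
result = {}

def loop_a():
    # Loop A blocks here until state B is signalled, then reads D.
    state_b.wait(timeout=5)
    result["d"] = shared["d"]

t = threading.Thread(target=loop_a)
t.start()
shared["d"] = 42                # some producer writes variable D...
state_b.set()                   # ...and then signals that state B is reached
t.join()
print(result["d"])              # 42
```

Because D is written before the event is set, the waiting thread is guaranteed to see the new value; reversing those two lines is exactly the kind of race being described.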


Back in the days when I was still doing work that got me near a lab all of our test machines were controlled and instrumented via LabView. We had one guy who did all the LabView work and he was treated like royalty. The project lived in fear that he'd realize he could do well consulting on his own and leave. Which is exactly what he did. He now owns his own company doing LabView work for companies not willing to invest in someone of their own like him.
posted by tommasz at 1:03 PM on December 7, 2014 [2 favorites]


LabView is really interesting in that it's this weird exception to how computers are normally told what to do. I'm sure if I had to work with it I'd hate it too, but from a safe distance I can simply admire its creative and alternative programming paradigm.

Scratch is another visual programming language that's had an enormous impact. It's explicitly for kids, a system for building animations and games and things. It's really pretty neat, here's a random example that generates Christmas trees.
posted by Nelson at 1:08 PM on December 7, 2014 [2 favorites]


Text is visual. The real question is: why do you think you can improve on the visual conventions of fixed-width text, which have been developed over several hundred years?

The conventions of fixed-width type are a fairly recent technological limitation on a far more effective visual mechanism for rendering text that goes back thousands of years (handwriting systems and variable-width typesetting).

We gave programmers a chance to develop something visual and still have to argue against the garbage design sensibility of Jakob Nielsen.
posted by 99_ at 1:08 PM on December 7, 2014 [2 favorites]


In the 3d modeling domain, grasshopper succeeds magnificently as a visual programming environment, probably because its users are already visual and spatial thinkers.

I've found that architecture and design students pick it up much faster than python, for example.
posted by signal at 1:21 PM on December 7, 2014 [3 favorites]


kagredon: "...but LabView will do something like read variable D before state B has been reached..."

I don't know anything about LabView, but I know race conditions can make any kind of multi-threaded programming a nightmare.

The graphical programming tool from my past that I hated most was Allen-Bradley PLC ladder logic. That was especially fun to debug at 2 am when the manufacturing process was at a dead stop and the code had no comments or decent variable names whatsoever. What's OUT_14 supposed to do if IN_3 is triggered? Better figure it out quick. Good times. Shudder.
posted by double block and bleed at 1:24 PM on December 7, 2014 [2 favorites]


To me, it seems like visual programming makes the easy parts of programming somewhat easier, and the hard parts a lot harder. For example, is there a way to do version control in LabView? Can you diff versions of the same code in any meaningful manner? Almost everybody that needs to code something can learn to write a for loop, figuring out what broke a non-trivial piece of code written by somebody else a couple of years ago is the fun part.
posted by Dr Dracator at 1:29 PM on December 7, 2014 [7 favorites]


I would do a lot more flow charting just as a conceptual thing if there was a decent tool to do it. By which I mean something which doesn't expect me to draw all the lines, decide which box types I want, yadda yadda. (I realize I have to do some of that.) But I really want to click on box a and choose "box b, bool true" or whatever and have it connect, and question me if the chart no longer flows if that happens. That would be super handy.

So what windows- or web-based flow tools do people use which aren't just drawing programs?
posted by maxwelton at 1:32 PM on December 7, 2014


double block and bleed: "I don't know anything about LabView, but I know race conditions can make any kind of multi-threaded programming a nightmare."

In fact, this is one of the things that LabVIEW is actually better at than, say, the tools available to a beginner Python programmer. Reading and mangling data from three devices at different rates - one serial, one on some weird proprietary instrumentation bus, one from some intermittently sending internet service? A person learning LabVIEW is going to be doing something useful a lot quicker than someone looking at a Python console for the first time.

One of the nice things of the visual dataflow style is that it doesn't force implicitly parallel operations to be artificially sequential like most imperative programming languages do, and LabVIEW uses that flexibility fairly well. Of course if sequencing actually needs to be enforced, it can quickly become a pain in the arse.
posted by vanar sena at 1:50 PM on December 7, 2014
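The "three devices at different rates" scenario maps fairly directly onto one thread per source feeding a shared queue. A minimal sketch, with the reader functions and their rates invented as stand-ins for real instruments:

```python
import queue
import threading
import time

# Hypothetical stand-ins for three instruments running at different rates.
def read_serial():
    time.sleep(0.01)
    return ("serial", 1.0)

def read_gpib():
    time.sleep(0.03)
    return ("gpib", 2.0)

def read_net():
    time.sleep(0.02)
    return ("net", 3.0)

samples = queue.Queue()

def poll(reader, n):
    for _ in range(n):
        samples.put(reader())   # each device streams at its own pace

threads = [threading.Thread(target=poll, args=(r, 3))
           for r in (read_serial, read_gpib, read_net)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(samples.qsize())  # 9 samples, interleaved by arrival order
```

The point of the comparison is that none of this queueing and thread management needs to be written at all in a dataflow diagram: the three sources simply run concurrently because nothing wires them in sequence.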


underflow: "Conceptually, pretty close, but they're not necessarily acyclic. Data flows are edges, data transformations are nodes."

Realistically, acyclicity makes analysis simple enough that it's very tempting to enforce it.
posted by pwnguin at 1:53 PM on December 7, 2014


VFX software is absolutely rife with visual programming like this. It's almost the default way to do anything. Houdini in particular comes to mind because its graphs of nodes are compiled first into a C-like language which you can view and edit, and then via LLVM to machine code. While it works amazingly well 90% of the time, when it comes to looping it feels very strange. For the curious, here's the equivalent of for(i = 0; i < 10; i++) { foo += sin(foo) }, both as nodes and as the generated code: screenshot

It would be nice if one could move from code back to a visual layout. I guess that's what the IDA Pro disassembler does, in a sense :)
posted by lcrs at 1:53 PM on December 7, 2014 [2 favorites]


Seconding that Grasshopper in architecture is really powerful, largely because you can mix VB.net (ugh)/C#/Python components in, making it a great blend of graphical & imperative programming.

The problem with visual representation of code is that it's either grossly simplified (and thus not powerful enough to express difficult concepts) or it's ridiculously over-complicated, in which case you're actually losing clarity by using a visual metaphor. Hunting the little "handle" on an object representing a case statement is not an improvement over just typing the damn code.

Actually, the revelation I had recently is that graphical/dataflow programming is more akin to a functional programming map/reduce/filter paradigm. You work with flows of data; functions/components are (often) stateless and don't have any side effects.

As a result, you can do a lot of powerful things with a few components, be assured that it'll work on different data of the same type, and quickly visualize/debug it (since you can see the data at each step of its processing).

Of course, you can't as easily pass functions/components as arguments, etc, but I think this general mentality still holds.
posted by suedehead at 2:13 PM on December 7, 2014 [2 favorites]
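That map/reduce/filter kinship can be sketched in a few lines of Python: each stage is a stateless transformation of a stream, and you can tap the stream between any two stages to inspect the data, much as you would probe a wire in a diagram.

```python
# Dataflow-as-functional-pipeline sketch: three stateless stages wired in series.
data = range(10)

doubled  = map(lambda x: 2 * x, data)              # "multiply by 2" node
filtered = filter(lambda x: x % 3 == 0, doubled)   # "keep multiples of 3" node
total    = sum(filtered)                           # "sum" sink node

print(total)  # 0 + 6 + 12 + 18 = 36
```

Each intermediate name is an edge in the graph; none of the stages mutates shared state, which is what makes the wiring safe to rearrange.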


People underestimate how much better we are at linguistic abstraction of relationships as opposed to visual abstractions.

Consider the following:
In each state, there are cities, in each city, there are schools, in each school, there are classrooms, in each classroom where there are one or more teachers and one or more students, each student is required to have their textbook. The textbook is either with the student at their assigned desk, in their locker, at home, or not purchased.
This is not an especially hard statement to parse (if a bit clunky in its programmatic nature). Compare the chart you would have to draw, with the many lines describing relationships. Or maybe you have some visual abstraction mechanism to indicate the same relations are repeated, but it still needs to be learned, and to somehow be conveyed graphically. Now add the following:
Each student has either no crush, a crush, or multiple crushes, on other students in their class, or in other classes.
The human mind can keep track of webs of directional relationships, presented in narrative form, that can't be displayed cleanly in any graphic representation (not to mention the case where you are limited to two dimensions for layout, and straight lines to indicate the relations).
posted by idiopath at 2:14 PM on December 7, 2014 [15 favorites]
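For comparison, here is roughly what that narrated hierarchy looks like as code rather than as a diagram. The class and field names are simply lifted from the narrative; this is an illustration of the linguistic structure, not anyone's actual schema.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class TextbookLocation(Enum):
    AT_DESK = auto()
    IN_LOCKER = auto()
    AT_HOME = auto()
    NOT_PURCHASED = auto()

@dataclass
class Student:
    name: str
    textbook: TextbookLocation
    crushes: list = field(default_factory=list)  # none, one, or many

@dataclass
class Classroom:
    teachers: list   # one or more
    students: list   # one or more

@dataclass
class School:
    classrooms: list

@dataclass
class City:
    schools: list

@dataclass
class State:
    cities: list

alice = Student("Alice", TextbookLocation.AT_HOME)
bob = Student("Bob", TextbookLocation.AT_DESK, crushes=[alice])
room = Classroom(teachers=["Ms. Smith"], students=[alice, bob])
print(bob.crushes[0].name)  # Alice
```

The crush relation is just another field, added in one line; drawing the same addition into an existing entity diagram means rerouting lines everywhere students appear.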


suedehead: I used puredata extensively in the past, and am a professional Clojure programmer today. I consider my work with visual dataflow to have been training wheels for functional programming, but I wouldn't use visual dataflow for anything serious. For another small example - in a textual programming language of sufficient power I can describe a new kind of relationship between two elements of the code, at various levels of abstraction, and clearly indicate what kind of relationship is being introduced when a new relation between elements is indicated. In visual dataflow, I have at best lines, thickness, and a color palette, and even in the most advanced hypothetical visual dataflow where you can design custom connectors, it is simply weaker than linguistic abstraction in describing novel kinds of relationships.
posted by idiopath at 2:22 PM on December 7, 2014 [2 favorites]


idiopath, that's a bread-and-butter structural model to specify in the UML class diagram notation. In fact it would probably be more difficult to specify declaratively in python without resorting to some third party embedded DSL.
posted by vanar sena at 2:25 PM on December 7, 2014


One of the growing Visual Scripting languages right now, because of the new Unreal Engine and its price, seems to be the Blueprints System in Unreal Engine.
posted by symbioid at 2:28 PM on December 7, 2014 [5 favorites]


My first brush with LabView was around the time when Windows95 came out. I struggled then (I think I gave up) to introduce a low-pass filter into a data line, and haven't been converted since. Oh, it does the easy stuff. But I have seen PhD students suffer for years, e.g. to merge data with different sampling rates or to expect a card with digital and analogue inputs to actually stream both, and at the end of it not really comprehend the code. NI support could be better, shall we say. But the lovers!
posted by StephenB at 3:05 PM on December 7, 2014 [1 favorite]


I've written test software and robotics software in LabVIEW over the last few years. The time I spend on it varies, but I've probably got the equivalent of at least a year of full time experience using it.

In many ways, I despise it. I've done my absolute best to move all our test automation stuff away from it to something more sane (and text based).

LabVIEW lures the inexperienced in with the promise of an easy time. (My boss fell for this). You place this box, place that box, draw a line and IT WORKS! Wow! You spend a few weeks absolutely loving it, developing ever more complicated software instantly with no debugging. In that way the learning curve is really, really shallow.

It's only after a few weeks, when you realise you have to make a minor architectural change (the kind of thing you could refactor in an hour in a traditional programming language), that it all comes tumbling down. The maintainability of it is absolutely horrible. Wires strewn across the sheet, not a single comment anywhere and you spent so little time developing each section that you can't even remember making it, let alone what it's meant to do. That's when you realise that you basically can't refactor it AT ALL. Ever. Write once, never ever try to change. Try to pick up someone else's software and 9 times out of 10 you'll scrap it completely and rewrite it.

Because of the way things are delegated to SubVIs a VI wiring diagram has the curious effect of making the really, really simple maths look terrifyingly complicated while hiding the awful system calls and library calls behind seemingly innocent boxes.

I think there's a bathtub curve to it. Once you've spent a few years working with it you start to learn the architectural decisions needed to manage complexity. It's not ideal, but by very diligently breaking your code up into sub-blocks, forcing yourself to document everything and following a sane program structure right from the beginning (remember that bit about how you CAN'T EVER REFACTOR), you can just about manage to write relatively complicated software without crying or giving in and starting over 3 or 4 times.

If I'm doing testing and need to quickly automate a few functions of an electronics test instrument, LabVIEW is still my port of call. Anything more and the greater overhead in writing something in a text based language is worth the eventual time saving.

I don't think the concept of visual diagram based programming is inherently a bad one, but I do think that the NI implementation is not great.
posted by leo_r at 3:08 PM on December 7, 2014 [4 favorites]


I haven't used it much, but GNURadio is an interesting environment for building solid, very domain-specific engineering applications. It has the pick, place and lace visual components that let you design and test your radio app as you would in ye olde hardware days, by bashing together prepackaged functional components on a bench with a bunch of signal sources and test equipment, but the code's never very far away. It has what seems to me to be just the right level of unobtrusive help for getting things right, via colour hints and useful-but-not-fussy UI components, while hiding away the complex (ho ho) DSP maths and hairy radio stuff, if that's what you need to happen. You're not absolved from knowing what you're doing, but the environment doesn't use that as an excuse for having the social skills of a mother bear whose cubs you're eyeing.

I wouldn't call it a visual programming language, because it isn't, but it uses visual programming concepts in a very effective fashion to solve a hard, practical problem.

Visual languages get more useful the smaller the domain of problems they tackle, provided they follow the basic rule of "make everything as simple as possible, but no simpler".
posted by Devonian at 3:08 PM on December 7, 2014 [1 favorite]


For the curious, here's the equivalent of for(i = 0; i < 10; i++) { foo += sin(foo) }, both as nodes and as the generated code: screenshot

We really should give generated code a pass here; it wasn't intended for human consumption, and the whole-program optimizer should compile it down to a constant anyway.

The visual design is up for critique though. There is a bit of an art to designing visual programming languages. For example, Yahoo Pipes allows inputs to either be constants you type in a text area, or wired in to the circular port right next to the area. This simplifies the diagram tremendously while still allowing the option to design Pipes with user inputs.

But part of it comes down to how hard it is to switch away from older paradigms. Essentially all nodes in your visual programming language should be for loops. Functional python code might look like:

sum(map(math.sin, range(9)))

If you squint, many unix utilities operate under the same looping processors metaphor, and your for loop would be translated as:

seq 9 | numprocess /sin/ | numsum

I think this looks pretty straightforward, and translates into a visual program pretty trivially. Even if you have to add a node for the number 9, it's still a 4 node graph vs 9 nodes.
posted by pwnguin at 3:10 PM on December 7, 2014 [1 favorite]
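That shell pipeline has a direct generator-based analogue in Python, where each lazy stage plays the role of one node. Note that `seq 9` counts 1 through 9, so this sketch uses `range(1, n + 1)` rather than `range(9)`; the stage names are borrowed from the num-utils commands purely for illustration.

```python
import math

# Generator version of `seq 9 | numprocess /sin/ | numsum`:
# each stage streams values to the next, like wires between nodes.
def seq(n):
    yield from range(1, n + 1)        # seq counts from 1

def numprocess(fn, stream):
    for x in stream:
        yield fn(x)

total = sum(numprocess(math.sin, seq(9)))  # the `numsum` sink
print(round(total, 4))
```

Nothing runs until the sink pulls, so the "loop" exists only implicitly in the wiring, which is the point being made about looping-processor nodes.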


I saw a demo of some software two weeks ago that reminded me of this. It's called MuleSoft and you use it to talk to all the applications you already have in your data center, plus any hosted applications or other web sites, plus custom actions (like sending email or data transformations).

It has this ridiculous GUI where you grab bits and drag them and double-click them and draw lines to connect them. The down side is, of course, that someone has to program the interface to each of your applications before you can include them in this non-programmer's drag-and-drop wonderland -- but damn, when we stuck with the sample items (a small MySQL database, Salesforce.com, and Gmail.com) we were able to do some really neat stuff in just a few minutes!

And I think the difference is that it never really gets compiled, so you can pop it open and edit it whenever you want.

(What I didn't tell the demo guy was that it reminds me a lot of the "Programming With HyperCard" class that I took in college in 1990.)
posted by wenestvedt at 3:27 PM on December 7, 2014 [2 favorites]


The human mind can keep track of webs of directional relationships, presented in narrative form...

I think this example is attractive, but only because laypersons may *feel* they understand the constraints. If you were to next describe a scenario and ask whether it fell within the narrated constraints or not, a lot of folks would get that wrong. They'd have to write down the narrative and carefully check against the constraints, essentially turning the story into pseudocode.

I've used Rational some, but only for UML, not for code gen. I find that there are about 20% of coders who can instantly grasp a UML diagram, where this visual language is completely lost on many.

I guess I don't know what my point is except 'multiple paradigms' seem to be necessary - always the 'code view' and the 'visual view'.
posted by j_curiouser at 4:08 PM on December 7, 2014 [1 favorite]


An extraordinarily successful visual programming environment is ... the spreadsheet. Take extremely simple math functionality, put it in that crazy rows-and-columns format, and suddenly everyone is doing what had previously taken programming (or accounting) expertise. I learned a lot from this history of the (computerized) spreadsheet, published when the idea was only 5 years old (it's 35 now).

Another visual programming technology I have enjoyed is the game SpaceChem. Among the things that make it difficult is that the tools you have to control the timing and sequencing of your operations are pretty limited.

Which reminds me of another "visual programming" technology I have sort of enjoyed but also been glad I don't have to spend more time with, the old McCulloch-Pitts logical neuron formalism. This is another language that (if appropriately extended to have a tape) has the expressive power of a Turing Machine, but is extremely opaque to human inspection. And it has the feature that timing sort of has to be worked out by hand, which is nightmarish. I don't actually know anything about electrical engineering but I imagine designing the first processors sort of resembled working with the MCP model.
posted by grobstein at 4:27 PM on December 7, 2014 [2 favorites]


Spreadsheets aren't really "visual," though. They're declarative (no control flow) programming in a matrix. "This cell is equal to A2 plus A3." There is no concept of looping or real "control flow" in the sense of an imperative or event-loop programming language.
posted by sonic meat machine at 4:31 PM on December 7, 2014


The biggest problem with visual languages is that you can't print out a program and study it away from the computer. This has the effect that code is easy to write but hard to debug.

I'm disappointed that modern programming languages are still designed to be written on what is basically a gussied-up ASR33 teletype. Shirley there's some happy medium between graphical and textual code entry.
posted by monotreme at 4:59 PM on December 7, 2014


Is there something like Labview that's not as sciency? Because that's how I visualise the apps/tech/things I want to make, but the Labview site seems to only be geared towards scientists.
posted by divabat at 5:10 PM on December 7, 2014 [1 favorite]


Spreadsheets aren't really "visual," though. They're declarative (no control flow) programming in a matrix. "This cell is equal to A2 plus A3." There is no concept of looping or real "control flow" in the sense of an imperative or event-loop programming language.

Sure, it stretches the metaphor. But Excel is used by tons of people who don't know the first thing about "real" programming. I think it's some combination of the declarative syntax and the visual presentation.

The control flow is totally obscured, but you can still iterate over values, use recursion, etc. I think seeing a column of numbers and applying an operation to all of them may be conceptually easier than a for loop for many people.

(And there's nothing magical about control flow. It's the computation metaphor that most computer programmers find natural -- which is great. But declarative formalisms have the same expressive power, and I see no reason to say they aren't "programming.")
posted by grobstein at 5:26 PM on December 7, 2014 [2 favorites]
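[grobstein's column-versus-loop point can be sketched in a few lines of Python; a hypothetical example, not from the thread:]

```python
# A spreadsheet user "fills down" a formula over a column;
# the imperative equivalent is an explicit loop.
column_a = [10, 20, 15]

# Spreadsheet-style: state the whole-column relationship at once,
# like writing =A1*2 in B1 and filling down column B.
column_b = [a * 2 for a in column_a]

# Imperative style: the control flow is spelled out by hand.
column_b_loop = []
for a in column_a:
    column_b_loop.append(a * 2)

assert column_b == column_b_loop == [20, 40, 30]
```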


I'm not saying they're not programming. SQL is declarative and is very important; functional languages like Haskell are declarative, and are probably The Way of the Future™ for concurrency and distributed computing. However, I don't think Excel is even remotely in the same paradigm as LabView et al. (It is closer to both SQL and Haskell.)

I'm disappointed that modern programming languages are still designed to be written on what is basically a gussied-up ASR33 teletype. Shirley there's some happy medium between graphical and textual code entry.

I don't really see why there needs to be. Written (human) language has not been replaced by photographs or diagrams, after all; there is value in the form, and there is even more value in it now that we have computers that can be used to automate transformations and modifications of the text. I don't see anything as powerful as regular expressions being built for graphical programming interfaces; nor do I think static analysis tools can ever be as powerful and customizable.

Programming is, at its heart, math. Diagrams can be used in mathematics for their explanatory power and conceptual expressivity, but that doesn't mean that the mathematical language has been replaced by diagrams.
posted by sonic meat machine at 5:38 PM on December 7, 2014 [3 favorites]


We gave programmers a chance to develop something visual and still have to argue against the garbage design sensibility of Jakob Nielsen.

I wish you'd stop. The web was a lot more interesting and useful before the designers took over.
posted by Mars Saxman at 5:44 PM on December 7, 2014 [6 favorites]


LabVIEW isn't the only flowchart oriented programming language, and I would wager it's not even the most popular. Programmable Logic Controllers (plc) are everywhere in industrial settings, and most are programmed in some proprietary version of Ladder Logic, which is derived from relay circuit diagrams.
posted by Popular Ethics at 6:01 PM on December 7, 2014


monotreme: Shirley there's some happy medium between graphical and textual code entry.
Be careful what you wish for…
posted by ob1quixote at 6:02 PM on December 7, 2014 [7 favorites]


Is Befunge a visual programming language?
posted by LogicalDash at 6:05 PM on December 7, 2014 [1 favorite]


Labview is awful for all the reasons listed above, but what I really hated about it was the carpal tunnel syndrome it would inevitably give me. All that fussy clicking and dragging wires around really took its toll on me.

That and endlessly searching through submenus of submenus of toolbars to find that one function that, goddamnit, I knew I'd used before...
posted by garethspor at 6:41 PM on December 7, 2014 [1 favorite]


I used to do consulting work in LabVIEW for a couple of small companies and got certified as a developer by NI. That's many years and versions ago, and I don't know how the language/framework has changed since then. They did introduce native object orientation shortly after I left LabVIEW behind. Before that, we had to implement it via clumsy third-party packages.

Some of the hate LabVIEW gets is unfounded, I think. It quickly gets hopeless if you aren't strict about coding standards and use some patterns to structure your code. That stuff isn't taught to or learned by most users, and it isn't always obvious how to implement patterns that are described for text-based environments.

That said, I learned to dislike LabVIEW and quit using it in the end, even though I work in the kind of research environment for which it is marketed. I find that it occupies a bad middle ground in the amount of complexity it can handle: It's too rigid to work as a scripting language, e.g. in a lab where the code changes every second run. For complex applications on the other hand, I eventually found myself working against the environment all the time, finding cumbersome ways around its inherent limitations.
posted by Herr Zebrurka at 8:00 PM on December 7, 2014 [1 favorite]


I've been working with control systems and machinery automation for over 25 years now. Most machines are programmed in a graphical language. Here are a few examples of other graphical languages I've worked with:
Pilz
DeltaV (one of the world's most popular control systems)
Think & Do
Ladder Logic
I've also done extensive work with LabVIEW in both industrial and educational environments. So, I would argue that if you're programming your machine or test system in a non-graphical language, you're the odd one out here, and if a controls engineer ever has to try to make your machine work, they will be stripping out your 'text-based' language and putting in things that are meant to run a machine.

As for LabVIEW specifically, I've looked through a few of these 'hate LabVIEW' lists and the complaints seem, in my mind, to fall into two broad categories.
1) You really don't know how to use this language, do you?
2) Fair criticisms of its weaknesses, but all languages have weaknesses that have to be worked around. This is, of course, why new languages keep getting invented.

I don't mean to be all rah-rah about LabVIEW; it is simply one of many machine programming tools that I've used, and one of the more popular ones. I'm quite used to its limitations and how to make it perform reliably.
posted by Confess, Fletch at 8:34 PM on December 7, 2014 [3 favorites]


I once had to operate a wind turbine that had its SCADA system written in LabView. It was ugly and slow, and you sometimes had to deliberately crash it to get access to the file system of the Win2k embedded box it ran on. Blecch.
posted by scruss at 8:46 PM on December 7, 2014


divabat: If This Then That (ifttt) and Yahoo Pipes may be useful tools for what you want to do. Don't be surprised if you end up moving from those to actual programming - they have limitations that you can get around in a textual language, but they make decent first steps I think, and can work just fine for simpler tasks.
posted by idiopath at 9:03 PM on December 7, 2014


Ugh. A lot of my lab's in-house software (we do optical engineering) uses LabView. I will spend my dying breath trying to get us to use Python. On the other hand, I've heard from some EEs that LabView is actually *really* intuitive for doing FPGAs, because there you're actually designing a circuit with something that looks like a circuit. So: visual abstractions for visual systems, and linguistic abstractions for linguistic systems!

Oh, and

Dr Dracator: Can you diff versions of the same code in any meaningful manner?

Supposedly yes, via an NI product called LV Compare. Never used it myself, and apparently it's visual too. Bleh.
posted by Maecenas at 9:44 PM on December 7, 2014


oh my god the compare function in LV is awful because it will mark things like positional moves of objects on the block diagram or which case structure is currently being displayed as "changes" and there's no way to filter those out, so you're left wading through 50 changes, of which 47 are cosmetic, in order to find the one change that broke the program

ask me how I know this
posted by kagredon at 9:47 PM on December 7, 2014 [3 favorites]


I once made a visual language called Texture:
http://yaxu.org/colourful-texture/

Things form a graph automatically, based on proximity and type compatibility, and connections are visualised, although the underlying model is FRP, not dataflow.

It's pretty awful to use, but interesting. It's inspired by the Reactable, which I think is the greatest visual programming language (although they don't call it that).
posted by yaxu at 11:30 PM on December 7, 2014 [8 favorites]


idiopath: I'm familiar with both, but I'm looking more for things that let me make apps or websites with them (so something a little more complex than IFTTT or Pipes).
posted by divabat at 12:26 AM on December 8, 2014


I consider my work with visual dataflow to have been training wheels for functional programming, but I wouldn't use visual dataflow for anything serious.

It depends - I often do a lot of computational 3d modeling / GIS / data analysis stuff, and a visual dataflow language (aka Grasshopper) is great because debugging it is very easy - since it's a flow, you can click on an element within the flow to see the data at that point. It would be a pain to do it by text-based programming. Indeed, it's the reason why many architecture firms around the world use Grasshopper (among other software).

But of course, it's nice to have text-based programming on hand, which is why I continuously think that it's too bad that most people don't know Grasshopper - being able to mix imperative and data flow is incredibly helpful. Plugging in a nice Python script in the midst of a data flow is the perfect best of both worlds. Plus, if the dataflow has subpatches/clusters, then you can encapsulate away groups of components into a function of sorts.
posted by suedehead at 12:37 AM on December 8, 2014


I apparently ran a thought experiment on this overnight (I thought I was sleeping, but, hey, subconscious batch jobs).

I'd like to think that, given a bit of time and decent enough resources, I could code something simple like Tetris in quite a variety of languages from a standing start in a reasonable time. As long as there was a tiny set of IO primitives and a bit of array handling. I wouldn't really begin to know what a visual language would look like that would let me do this, unless it's one that has a lot of specific knowledge for this class of problem baked in. And a lot of words.

I could tell you how to do Tetris in a few lines of text that describe the algorithms completely while being independent of any particular text-based language. How to write down algorithms in pseudocode is absolutely CS101 that can be imparted to a lot of people quite quickly given a base level of intelligence and a certain grasp of abstraction.

How skilled a graphic designer/illustrator would it take to draw the same algorithm, without words, in a way that would be widely comprehensible? You can't run a civilisation on cave paintings. As hard as you try to represent abstractions in pictures, you'll end up with written language.
posted by Devonian at 2:09 AM on December 8, 2014 [2 favorites]
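[For what it's worth, the pseudocode-to-text translation Devonian describes is easy to demonstrate. Here is a sketch of one Tetris step, line clearing, in Python; the board representation and function name are invented for illustration:]

```python
# Board is a list of rows; each row is a list of cells (0 = empty).
# "Clear every full row, then pad the top with empty rows" --
# one sentence of pseudocode, a few lines of text.
def clear_lines(board):
    width = len(board[0])
    kept = [row for row in board if 0 in row]      # drop full rows
    cleared = len(board) - len(kept)
    empty = [[0] * width for _ in range(cleared)]  # new blank rows on top
    return empty + kept, cleared

board = [
    [1, 1, 1],   # full -> cleared
    [1, 0, 1],
    [1, 1, 1],   # full -> cleared
]
new_board, n = clear_lines(board)
assert n == 2
assert new_board == [[0, 0, 0], [0, 0, 0], [1, 0, 1]]
```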


Listening to all the biologists commenting... suddenly I'm really glad my lab uses Matlab primarily and that that was what I learned first. I've never investigated Labview at all except for a few-month span when I was playing around with a Lego Mindstorms robot with the idea of using it for an experiment, but at the time I remember it being really simple but also irritatingly clunky to get the little robotic machine going. With text-based code, I can see much more of what I'm doing at a time, which for me is useful for getting a sense of what I want the whole program to do. Visual interfaces take up a LOT of screen space for the information they convey, and for that reason I'm not a huge fan of them.

On the other hand, my lab also uses a visual-based programming language (RPvdsEx) to run some of our specialized equipment. (Matlab then gets used for analyzing the output from that equipment.) I've given other lab members crash courses on both this visual language and our Matlab programs so that they can figure out how to use it for their own work. I'm usually working with total neophytes to programming, and in my experience it's way easier to walk them through the Matlab programs than the RPvdsEx programs.

There's a bunch of reasons for this--in order to get as much information about the modules in a minimal space, RPvdsEx has a bunch of names for each kind of module that are labeled in shorthand. That makes a lot of sense if you're used to working with this data, but not so much if you're new to it. The visual interface is also much more of a pain to properly comment and annotate, so the Matlab programs tend to be well commented while the RPvdsEx programs tend to be minimally commented. It's surprisingly easy to tangle up the RPvdsEx lines from module to module if you're poking it and you're not very careful. And it's also easier to miss settings in the visual interface that would be stated in a specific place in one of our Matlab programs. Frankly, I vastly prefer the text-based programming languages I dabble in, and that seems to be the view of my department as a whole--R is the hot thing for stats, everyone's encouraged to learn at least some Python, anyone doing any work with genomics needs to know how to work with a bash shell. Is this Labview visual-programming-preferred thing primarily a proteomics thing or something? I've seriously never encountered it before.
posted by sciatrix at 6:40 AM on December 8, 2014 [1 favorite]


Visual programming is one of those things which sounds good until you think about it a bit more. It's sad that people have to use this crap in the real world.

One thing I love and heartily recommend is the exact opposite: PlantUML! Write code, automatically generate many kinds of diagrams. It's quick once you learn the syntax, works perfectly in source control and there are plugins for various platforms/tools (I mainly use it in Eclipse these days).
posted by dickasso at 7:21 AM on December 8, 2014 [3 favorites]


Is this a thread where I can shit all over MatLab, too? Because that thing is awful.

Just the requirement that all text would be forced onto a vertical "grid" of the font x-height would be a massive improvement. As it is, it's quite easy to place an equation 1/20th of a line higher than another, causing unexpected order of operation (Up->down happens before left->right).
posted by IAmBroom at 8:26 AM on December 8, 2014 [1 favorite]


I have used LabVIEW on and off for years - back in the 90s, I was a student programmer and we ran our physics labs on LV with excellent results. Data acquisition, sample changers, ovens, stepper motors and such; after a year or so we could run a dozen samples lights-out over the weekend. Later I used it for streaming data from earthquake engineering.

It's actually quite possible to write robust, modular maintainable code, though as mentioned above it requires a huge change of mindset. I liken it to a lot of hard-concept learning, for example linear algebra. You have to put in the hours and learn a new way of thinking about programming, and I've seen more than one developer fall at the hurdle. It's hard, full stop.

LV sucks for a lot of tasks - string manipulation, heavy math and parallel programming for example. Conversely, if you want to sample a channel at 50kHz, FFT it with a Hamming window, plot to the screen and save to disk, LV is the perfect tool for the job. I like it a lot and wish it were priced for side projects.
posted by phubbard at 8:57 AM on December 8, 2014 [1 favorite]
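[For comparison, phubbard's "perfect tool" pipeline is also only a few lines of textual NumPy; a sketch, with a made-up 1 kHz test tone standing in for the sampled channel:]

```python
import numpy as np

fs = 50_000                              # 50 kHz sample rate
t = np.arange(fs) / fs                   # one second of samples
signal = np.sin(2 * np.pi * 1_000 * t)   # fake 1 kHz test tone

# Hamming window, then FFT -- the same pipeline phubbard describes in LV.
windowed = signal * np.hamming(len(signal))
spectrum = np.abs(np.fft.rfft(windowed))

# With N = fs, the rfft bins are spaced 1 Hz apart, so the
# spectral peak should land right on the tone frequency.
peak_hz = np.argmax(spectrum) * fs / len(signal)
assert abs(peak_hz - 1_000) < 2
```

(The plotting and disk-streaming parts are where LabVIEW's built-in instrumentation support genuinely earns its keep; the math itself is a one-liner either way.)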


sciatrix, are you me? I left a lab like that 3 months ago. I think I once had to describe RPvdsEx as the unholy spawn of Matlab Simulink and LabVIEW (with the worst aspects of both! Because screw you.). The only saving grace was that I managed to wrangle the toolkit to get the data exported out into things Matlab proper can actually deal with. At least Matlab makes more sense.

I'm still That Person on call to come in and deal with explaining how things work to new lab members. At least they usually pay me in beer for it.
posted by ultranos at 9:17 AM on December 8, 2014 [1 favorite]


Is this Labview visual-programming-preferred thing primarily a proteomics thing or something?

Neuroscientists make use of it. Microscope control, signal acquisition from patch clamp electrodes, etc.
posted by a lungful of dragon at 9:52 AM on December 8, 2014


I don't think there's anything inherently wrong with (well-designed) visual programming environments as an offering for "non-traditional" programmers. I think a lot of programmers right now are afraid of them - not because they think they will become superfluous but because they think they will be forced to use them. I don't know about "we" but I am much better at verbal/symbolic reasoning than spatial, that's why I'm a programmer in the first place.
posted by atoxyl at 10:51 AM on December 8, 2014


There's been a few comments regarding PLC Ladder Logic.

The metaphor here is that power is supplied at the left side of the screen, and you write logic statements that selectively allow power to flow to "outputs" on the right side of the screen. If all the objects in the line are "ON", the power flows. If anything in the line is "OFF" or "FALSE", the power is interrupted.

Ladder Logic was developed specifically to appeal to electricians who would recognize the style from wiring diagrams and blueprints. As such, it has been quite successful.

To handle Case statements, ladder rungs can have parallel rungs or branches that give you a visual option to flow power, i.e. if This condition OR this condition is On/True, allow power.

Explicit comments and unambiguous variable naming are key to making PLC code comprehensible and traceable. Amazingly enough, Allen-Bradley's PLC 5 generation is nearly 30 years old and still in wide use. Their stuff is almost bullet-proof, although expensive, and the software has many shortcuts and tricks.

LabView is even older. I saw a version demonstrated at an early MacWorld. At the same show, there was a visual relational database called Helix that used computational 'tiles' to program. Helix surprisingly still soldiers on, in a sort of software 'death march', its adherents still waving tattered banners, but that is a story for another time ...
posted by bc_fred at 11:40 AM on December 8, 2014 [1 favorite]
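[bc_fred's power-flow metaphor maps directly onto boolean logic. A toy rung in Python, with the contact names invented for illustration:]

```python
# One ladder rung: series contacts AND together, parallel branches OR.
#
# |--[ start ]--+--[ guard_closed ]--+--( motor )--|
#               |--[ override     ]--+
def rung(start, guard_closed, override):
    # Power flows only if start is on AND either parallel branch passes.
    return start and (guard_closed or override)

assert rung(start=True, guard_closed=True, override=False) is True
assert rung(start=True, guard_closed=False, override=True) is True
assert rung(start=False, guard_closed=True, override=True) is False
```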


One of the growing visual scripting languages right now, thanks to the new Unreal Engine and its pricing, seems to be the Blueprints system.
posted by symbioid at 5:28 PM


As the resident Kismet* wizard at a recently closed AAA studio, I churned out several thousand nodes worth of visual scripting in Unreal 3 (the old version) every week for about five years.

The Good: visual programming allows you to arrange and select code in ways that are simply not possible with text. The amount of power unlocked with judicious spatial arrangement of logically related code paths and box select (plus additive/subtractive modifiers) + copy & paste is staggering. Reuse of any code block, subset thereof, or arbitrary portions of multiple blocks at any time, for free. For gameplay prototyping a good Kismet scripter can typically turn around feature requests and systems in a quarter the time required for a first-pass C++ implementation. This rarely produces shippable gameplay systems in Unreal 3, but with Unreal 4 that's frequently no longer the case.

The Bad: storage, access, and delta tracking of state is an unmitigated clusterfuck, always, with virtually any visual programming language. Every aspect of the approach will fight it, and the core tech team is left fighting brutal cache misses from everything being essentially global. In the end our tech team wound up writing a dedicated state machine system with a rich grammar and a bi-directional event system for passing calls and arguments to/from the scripting layer.

The Ugly: attempting to debug anything written by designers with no serious programming background "empowered" by the visual paradigm. Attempting to read anything written by other scripters of any skill level until layout/commenting styles are standardized. Attempting to impress upon new hires the importance of adhering to those standards despite the visual scripting clearly being intended for cobbling together quick snippets of high-level behavior.

The Lesson Learned: visual programming is awesome for quick prototyping, but robust state tracking ships games.

*Kismet was the Unreal 3 visual scripting language that was supplanted by Blueprint in Unreal 4. Blueprint (frequently referred to as K2 or Kismet2 in the engine source) essentially extends Kismet into a full, compiled programming language which runs much closer to native code speed, at the cost of mostly-linear execution path. Nearly all game-type templates for Unreal 4 are available as both C++ and pure-Blueprint implementations - it's pretty clear Epic poured several million dollars into this.
posted by Ryvar at 9:25 AM on December 9, 2014 [10 favorites]


The important thing about Labview is to make sure to use the bloody error wires to control sequencing; don't just use sequence structures and so on. And don't use global variables if you don't need to.

I tried telling people this in the lab where I was the Labview guy, after having written, for my first big project, a lumbering franken-program that used sequences where I should have used error wires, with one big panel of global variables.

Is there something like Labview that's not as sciency? Because that's how I visualise the apps/tech/things I want to make, but the Labview site seems to only be geared towards scientists.

Pure Data. Lots of music and experimental video stuff.
posted by sebastienbailard at 8:09 PM on December 9, 2014 [1 favorite]


phubbard: "LV sucks for a lot of tasks - string manipulation, heavy math and parallel programming for example."

I admittedly have never used LabView, but I'm curious about that last one. The Network of Processors model that LabView uses gets its own chapter in my distributed systems book, because it trivially maps onto a parallel processing model. Pretty much the only downside is that operations which need the entire dataset, like sort, bottleneck the parallelization. So you either use insertion sort, or just buffer the entire thing until EOF and then quicksort.

Does the fact that LabView allows cycles take this off the table?
posted by pwnguin at 11:25 PM on December 9, 2014
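[pwnguin's sort-bottleneck point can be sketched with Python generators; a toy model of streaming stages, not actual LabVIEW semantics:]

```python
# Map-style stages stream: each value flows through as it arrives,
# so independent stages can run in parallel.
def scale(stream, k):
    for x in stream:
        yield x * k          # emits immediately

# Sort cannot stream: it must buffer the entire dataset before
# emitting anything, which serializes the pipeline at that node.
def sort_stage(stream):
    buffered = list(stream)  # waits for EOF
    yield from sorted(buffered)

data = [3, 1, 2]
result = list(sort_stage(scale(iter(data), 10)))
assert result == [10, 20, 30]
```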


OMG, I (briefly) used helix. To no good end, but it seemed a way to YOUTH me to do some DB stuff. I was wrong, but mostly because YOUTH.
posted by maxwelton at 12:27 AM on December 10, 2014


Matz, the designer of Ruby, published Streem 16 hours ago. While not a visual language, it uses the same data flow paradigm, and one could presumably construct an IDE that allows people to "draw" software using it behind the scenes.
posted by pwnguin at 4:17 PM on December 11, 2014

