Category Archives: Interviews

The (new) VIS 2013 Industry and Government Track

It always sounds weird to me when I have to explain that the VIS Conference (formerly VisWeek) is not only for academics. Until now, I always had to use a number of convoluted arguments to explain it, but now I have one more tool in my Swiss Army knife: the Industry and Government Track. What does it mean? Well, it basically means that if you want to show your work at the conference, there is now a track specifically for industry and government work.

I think this is very useful for people who are not in academia, and important for increasing the dialogue between researchers and practitioners, so I decided to ask a few questions to the track chairs Danyel Fisher, David Gotz, and Bill Wright. In particular, I wanted them to explain how it works and why you should participate. And also how to convince your boss!

[Note: the deadline is very, very close: June 27. I apologize for posting this so late, but I hope you may still submit something. Also, from time to time submission deadlines get postponed, so keep an eye on it!]

The new deadline is July 3rd, 2013.

What is the Industry and Government Track?
The Industry and Government Track is ideal for people who work with visualization in their day-to-day jobs: whether building a visualization dashboard for a business intelligence application, or putting together charts and graphs to explain their products or policies to customers, or just exploring their own data. We want to help them participate, learn from, and teach the IEEE VIS community.

The Track consists of several components: a poster session for practitioners; a panel of invited talks from industrial visualization-users; and a series of other events through the conference—workshops, tutorials, panels, and papers—that are likely to be particularly interesting to practitioners. In addition, companies that use or create visualizations are invited to exhibit in our exhibition hall; a discount “startup” package helps small companies get exhibition space for a song.

Why this new track at VIS 2013?
The VIS conference has traditionally been an academically-oriented conference, with some of the most innovative work in information visualization and visual analytics. But it can also be a little bit insular: many of the cool visualizations don’t make it outside our community, and we aren’t necessarily aware of the challenges that drive the outside world.

That’s certainly not always true, of course: the VAST conference does a great job of watching how data analytics works in the real world; tools like Many Eyes and D3.js have made a substantial impact; and the conference attracts attendees from Microsoft, IBM, and Google, as well as a variety of government agencies and smaller companies. However, VIS 2013 would like to increase the amount of mixing between these communities. We believe that sharing ideas and building connections across these artificial boundaries would be beneficial to everyone. We think that it is a great time for us to share our research with a broader audience—and we’d like to learn from the outside, too.

Who should submit to it and why? What can one get out of presenting a poster at VIS 2013?
Anyone who has solved an interesting problem with visualization—in the way they share it, or show it, or the angle they take on the data—is welcome to submit a poster. So is anyone who is working their way through a broad problem. A detailed Call for Participation with submission instructions can be found on the Industry and Government Track web page. The deadline for submissions is June 27, 2013.

Posters are a great way to share your work with the conference attendees. Posters will be displayed in a prominent location for several days at the conference, right next to the research and student posters that have been a traditional part of the VIS conference over the years. Attendees can browse the posters throughout the conference, and a formal poster reception brings large audiences to the poster gallery. In addition, there will be a poster “fast-forward” held during a general session in front of all attendees. During the fast-forward, poster presenters can speak briefly about the main idea behind their poster. Finally, poster abstracts will appear in the published conference proceedings.

With all of these events, presenting a poster should give participants a context to more easily meet other interested conference-goers, and will get them broad exposure to the community. And of course posters aren’t the only reason to participate. We also encourage folks to join in the many other IEEE VIS events throughout the week.

Ok sounds great but … how do I convince my boss to fund my trip?
The best way to change someone’s mind about funding your trip is to focus on the business value that you’ll get from attending. And we hope that you’ll find it to be a pretty easy argument to make.

For experienced visualization practitioners, attending IEEE VIS can connect you with new contacts from around the world. Leading experts from industry, government, and academic research centers all descend on IEEE VIS to talk about visualization, foster collaboration, and learn from each other. The variety of expertise that gathers from around the world makes VIS a great networking opportunity. But it isn’t just experts. VIS can also be a fantastic place to recruit fresh new talent as many of the top visualization students from around the world come to showcase their latest projects. And speaking of projects, VIS is a great place to learn about new developments in the field. Research talks showcase the latest work emerging from labs across the world. Learning from these presentations can help you keep your projects fresh and cutting edge as the field continues to evolve.

For those who are newer to the field, an added benefit is the opportunity to learn from experts from a huge range of backgrounds. Formal tutorials and workshops are a key part of the program and offer lessons or discussions on specific, focused topics. Panel discussions are another great place to gain insights by listening to leading researchers or practitioners share their experiences. The program includes several social functions and coffee breaks where you can borrow the ear of experienced visualization researchers and practitioners to gain insights into the problems you are facing in your own work.

And finally, if you are like many IEEE VIS attendees, you’ll come away inspired and overflowing with new ideas to bring back home. So tell your boss about all of the great things you’ll learn, the contacts you’ll make, the skills you’ll develop, and the energy and innovation that you’ll bring back home with you after the conference. That sounds like a winning argument to me!

Thanks Danyel, David, and Bill! I hope you’ll get fantastic submissions.


Tools from the Pros #3: Jan Willem Tulp on D3 and Protovis

When I first saw a visualization developed by Jan, the Ghost Counties, I was totally fascinated. It’s brilliant. It took me a while to understand how it works, but once I got it I could not help admiring the strange mix of complexity and simplicity it provides.

Although he looks so serious in the picture on the left, he has a big smile and he is fun. I met him for the first time at Visualizing Europe and since then we have exchanged many emails. Plus, he is a regular commenter here (and everywhere) and I love him for that.

I don’t know how much I have to add to convince you that his advice is valuable. Just take a look at his portfolio and judge for yourself. He is IMO one of the most interesting data visualization freelancers to have recently appeared on the scene.

I know from talking with him that he is proficient with several technologies, but he has a passion for D3.

How did you start using Protovis/D3?

I’ve always been someone interested in the latest technologies. So, since I follow the data viz community very closely, I was aware of Protovis very early on, and I was aware of the development of D3 even before it was released to the public. I have a software development background, so I don’t have too much trouble finding my way in new programming languages, and since it excites me to work with new technologies and frameworks, I just started playing with Protovis and D3 as soon as they became available.

What’s the best and worst aspect of Protovis/D3?

The best aspect of Protovis is that it is a domain-specific declarative language, which means that it is fairly easy to start writing code using visualization-related keywords and functions. The best aspects of D3 are its flexibility (more direct integration with SVG) and better performance. The worst aspect of both D3 and Protovis is that it’s hard or impossible to get them working on older browsers, and the learning curve for D3 may be somewhat steeper than for Protovis.
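To make Jan’s point about declarative style concrete, here is a rough, library-free JavaScript sketch of the idea both Protovis and D3 are built on: you describe marks as a function of your data instead of drawing them step by step. The `barsFromData` helper is hypothetical, not part of either library’s API; it just emits the SVG rects that a real D3 data join would create for you.

```javascript
// A minimal, library-free sketch of the declarative idea behind
// Protovis and D3: marks are described as functions of the data.
// barsFromData is a hypothetical helper, not a real library API.
function barsFromData(data, barWidth, chartHeight) {
  return data
    .map(function (d, i) {
      var h = d * 10; // naive fixed scale: 10 pixels per data unit
      // One <rect> per datum, just as a D3 data join would bind one
      // element per value in the bound array.
      return '<rect x="' + i * barWidth + '" y="' + (chartHeight - h) +
             '" width="' + (barWidth - 2) + '" height="' + h + '"/>';
    })
    .join('\n');
}

var svgBody = barsFromData([4, 8, 15, 16, 23, 42], 25, 500);
console.log('<svg width="150" height="500">\n' + svgBody + '\n</svg>');
```

In real D3 you would express the same mapping with a data join (`selectAll('rect').data(...).enter().append('rect')`), and the library would also handle updates and transitions when the data changes.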

Ok, I am a beginner and I want to learn Protovis/D3, where do I start?

I think Protovis is easier to start with than D3, but Protovis is no longer under development. I also see that the Protovis mailing list is not very active anymore, while the D3 mailing list is very active. But I guess Protovis would be a very good way to start if you’re a beginner. Basic JavaScript programming skills are recommended for both Protovis and D3. For some great Protovis tutorials you should check out Jerome’s blog (part 1 – 5: working with data in Protovis). Also, the Protovis website has quite a few examples that are pretty good. For D3, documentation, examples, and tutorials are still under development, so with D3 you’re more on your own right now (and of course the mailing list). But things are improving rapidly.

How is the learning curve vs. return-on-investment of Protovis/D3?

Protovis is really a good way to learn visualization and programming at the same time. Protovis is a language that is really geared towards the visualization (and diagrams) domain, so it really makes sense to talk about axes, marks, lines, bars, pies, etc. Also, there are quite a few good examples on the Protovis website, so it’s fairly easy to get started. However, Protovis is not supported anymore, and people are really moving to D3 now, so getting support may become a little tricky. Also, Protovis does not perform as well as D3 with very complex graphs, for instance, and, compared to D3, in Protovis you’re a little bit limited in the animations you can achieve. So, overall, it’s a good way to start and also good for making some nice standard diagrams and visualizations, but if you really want to do ‘heavy’ visualization stuff, you might consider moving on to D3.

D3 is more powerful, more flexible, and seems to have more capabilities (and better performance) than Protovis. The flip side is that in order to be a more flexible programming language, the language is also more abstract. Though many concepts of Protovis are also implemented in D3, and there are quite a few predefined visualization layouts, it’s also more useful (compared to Protovis) to gain some knowledge of SVG, since it’s more likely that you’ll do some low-level stuff. D3 does give you much better animation capabilities, better performance, and more flexibility, so once you get the hang of D3 and some SVG, you’re able to create some very compelling interactive visualizations.
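Since Jan notes that D3 rewards some knowledge of SVG and low-level work, it may help to see one small but constant low-level task: mapping data values to pixel coordinates. Below is a dependency-free sketch of the kind of linear scale D3 ships for this (as `d3.scale.linear` in the D3 of that era); `linearScale` here is my own illustrative helper, not the library function.

```javascript
// A dependency-free sketch of a linear scale, the workhorse that D3
// provides (as d3.scale.linear) for mapping data values to pixels.
// linearScale is an illustrative stand-in, not the real D3 function.
function linearScale(domain, range) {
  var d0 = domain[0], d1 = domain[1];
  var r0 = range[0], r1 = range[1];
  return function (value) {
    // Normalize into [0, 1] over the domain, then project onto the range.
    var t = (value - d0) / (d1 - d0);
    return r0 + t * (r1 - r0);
  };
}

// Map data values in [0, 100] onto a 500px-tall chart. The range is
// inverted because SVG's y axis grows downward (0 is the top).
var y = linearScale([0, 100], [500, 0]);
console.log(y(0));   // 500 (bottom of the chart)
console.log(y(100)); // 0 (top of the chart)
console.log(y(25));  // 375
```

The inverted range `[500, 0]` is exactly the kind of SVG detail Jan is talking about: once you internalize it, the “low-level stuff” in D3 stops feeling mysterious.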

What other tools would you recommend other than Protovis/D3?

The tools you mention are some of the best right now. I also think that Raphaël is a fairly good alternative if you want to do JavaScript-based visualization that works in older browsers as well, though personally I don’t have much experience with Raphaël yet. Also, Processing now has an Android mode, which is great if you want to create visualizations that run on Android phones, and the upcoming Processing 2.0 also has a JavaScript mode, so you can easily create HTML5 canvas-based visualizations with the Processing development tool.

A recommendation I’d like to add: when I work on Protovis or D3 visualizations, I use TextMate on the Mac. It allows you to open a preview window that renders your visualization in near real-time as you type your code. I’m sure there are similar tools that do this. It is really great for getting immediate feedback while you’re coding.

Tools from the Pros #2: Joe Mako on Tableau

Ok guys, here we are with a new interview in Tools from the Pros, the series in which I interview data visualization professionals about their favorite tools. This time we have Joe Mako talking about his experience with Tableau.

Before I start telling you anything about Joe, let me tell you how I ended up interviewing him. I was looking for an expert to interview with proven experience in designing advanced visualizations with Tableau, so I decided to ask some Twitter friends. The result? Lots of names, but only one was always there: Joe Mako. If this is not enough, take a look at the impressive list of video tutorials he has on his blog.

Joe is employed at S2 Statistical Solutions, where he does data integration and visualization. This is what Joe wrote when I asked him to send me a short bio:

I have used Tableau extensively since 2008, creating interactive viewpoints of data to enable people to get answers to their complex questions easily. Currently, I specialize in integrating complex databases from health insurance companies, hospital networks, and the government to enable better evidence-based decision making. I am active on the Tableau user forum, solving a variety of situations for many Tableau users ranging in skill from beginning to advanced.

I really enjoyed reading his interview. He provides a lot of interesting references and links. If you are thinking about using Tableau, I am sure his tips will help you a lot with your final decision.

How did you start using Tableau?

About three years ago, in 2008, I had been reading FlowingData for a few months when I noticed Tableau was a sponsor and decided to check out their software to see if it could help with some projects I was working on. I felt like I was decent with formulas and VBA in Excel, but I always had trouble making a decent chart. When I first saw Tableau in action, I knew it would make my job of making sense out of numbers easier, because a good chart was easy to make. The first big project I used Tableau on was reporting on data quality and monitoring the cleanup of the records. With the guided analytics Tableau enables, I was able to make interactive dashboards allowing a viewer to see which records were wrong, why we knew they were wrong, and how much revenue was lost because of the errors, and then to track which records got fixed and the increase in revenue. The project was a success, and I knew creating visualizations in Tableau was my passion. In the past three years, I’ve rarely gone a day without using their software, and being a part of the Tableau user community has become a big part of my life. The many great people I have met, and the friendships I have gained by participating in the community, are most valuable.

What’s the best and worst aspect of Tableau?

The Tableau Data Engine is the single feature I would miss the most if Tableau were taken away. I don’t know if there is a specific term that can fully describe it, because it is unique, and it has a long list of benefits: bulk text loading, super fast aggregations, incremental appends, and an all-around seamless experience. Suffice it to say, if I am working with data and I don’t need a real-time feed, I’m loading it into the Tableau Data Engine (TDE) every time. It has been called an “in-memory” database, but that may not be the most accurate term for it, because it is not like other “in-memory” databases. Instead of loading the entire data set into RAM, the TDE intelligently selects what data to load into RAM, so that we can work with data sets larger than our available RAM. So I am not sure there is a good way to compare it to other data storage systems, other than knowing that the TDE was created specifically to work with Tableau, and it is a beautiful thing.

Tableau is a focused and opinionated piece of software, meaning it is not a complete solution, but for what it does enable, it does a great job. The number one thing I believe is lacking from Tableau is easy-to-use, fast statistical functions. Currently, with custom table calculations and data preparation, I have found that Tableau can compute nearly any calculation, but it is too much of a work-around to force the software to do something it was not designed for, because it adds unnecessary complexity and commonly makes the interaction slow. There is already a built-in delay with published workbooks (waiting for the server-generated images to download), and the additional delay of waiting for the computations to be evaluated becomes a major drawback.

Ok, I am a beginner and I want to learn Tableau, where do I start?

Tableau provides phenomenal training resources for free. Their On-Demand training and Live Online training would be the first places to check out. There is no shortage of interesting workbooks that you can download and inspect or try to re-create from places like the Visual Gallery and their blog, and there are live workbooks embedded throughout their website (I don’t think I’ve found them all yet). Their Knowledge Base has over 300 step-by-step guides on how to accomplish useful tasks. And on the Q&A Forum there is no shortage of interesting situations and people like myself eager to help you accomplish what you want in Tableau.

How is the learning curve vs. return-on-investment of Tableau?

I remember that learning Tableau was a real change in my approach to data, and I still feel like I learn something new about Tableau every day. My experience in learning Tableau has been like playing a game: the first things are easy, and some really amazing analysis can be created with just the use of the mouse. I think of it like “The Princess Rescuing Application”, specifically slide 16, where it is a series of short learning leaps, and each one brings joy with accomplishment.

While Tableau has a clean interface on the surface, many complex operations are just under the surface, a click away, and once you know how to do something in Tableau, it becomes simple and fast to perform. The main exception is custom table calculations, where there are a multitude of non-obvious factors affecting their evaluation. I believe an understanding of SQL would make Tableau more understandable and less mysterious. If you need to make sense of numbers, the return-on-investment is easy to see: things that would take hours, or require programming, take minutes and drag-and-drop inside of Tableau. I consider it having a conversation with my data when I use Tableau, because as quickly as I or the person next to me can ask the question, Tableau enables me to provide the answer.

What other tools would you recommend other than Tableau?

Once you understand Tableau’s approach to data, I am sure it will be clear that Tableau does not stand alone as a complete data solution. While Tableau is fantastic at the human-centric tasks, it does not perform tech-centric tasks (see “BI Has Hit the Wall” by Stephen Few), and you will need software to help you prepare your data for Tableau. Every few months I change my tech-centric applications as my needs change and I try new things, but I think Pentaho Data Integration (Kettle) is wonderful for ETL. There are many ETL applications out there, and I recommend trying them all to find the ones that fit your style and needs best.

Tools from the Pros #1: Miriah Meyer on Processing

I am really excited to announce my first interview for the “Tools from the Pros” series! We start with a very good one: Miriah Meyer talks about Processing.

Miriah is an assistant professor at the University of Utah. I met her only briefly at a couple of conferences, but I am a huge fan of her research work on interactive visualization systems for biological data analysis (be sure to check it out!). Her tools are a rare example of well-crafted design studies in interactive data visualization and, as far as I understand, they are all developed in Processing.

I really like this interview because it covers many of the things beginners (and more advanced users) need to know. One above all: the rapid prototyping approach Processing makes possible and the whole mindset behind copying and pasting code to explore alternative designs.

Thanks Miriah! I think people have a lot to gain by reading this interview.

How did you start using Processing?

I started using Processing in 2008 when I helped design a new undergraduate visualization course at Harvard. We chose Processing as the language for the course, and I learned the core bits of the language while putting together homework assignments. I quickly came to appreciate how Processing got rid of all the annoying parts of graphics programming — setting up a rendering window, registering callback functions, dealing with linking and libraries and compiling to multiple platforms, that ridiculous gluPickMatrix — not to mention the headache of types.

We had Ben Fry come to the class to give a guest lecture that spring, after which we went out to lunch. I’ll never forget his answer to my question of why he created Processing. He said (well, I’m paraphrasing here) that he wanted a sandbox to play in, to quickly develop prototypes without getting bogged down in the architecture of the code. He emphasized Processing as a language to try different designs, with real data and with real interaction. And that cutting and pasting code in Processing is totally cool if it gets a design up and going faster. He wanted a language that lets people totally focus on the visualization concept and design without having to think too hard about the code underneath.

Well, that sounded great to me. And I quickly became a total convert, cutting and pasting code until things got so messy that I had to just rewrite an entire project. I found this philosophy totally liberating and that my work benefited immensely from rapid prototyping. Processing is a language that supports this style of development.

What’s the best and worst aspect of Processing?

In short, the best aspect of Processing is how little code it takes to get a simple scene with callbacks going — it is a small fraction of what it would take with OpenGL. Simple primitives like circles, squares, text, etc. are nicely abstracted into one-line function calls. Mouse and keyboard callbacks are handled automatically. There is a wide variety of common graphics helper functions available, like lerp-ing colors. Full-screen apps work without having to grab weird OS handles. The PDF library that exports the current scene as vector graphics has forever changed figures for papers for me. And the ability to export an application to a variety of operating systems in a single go is absolutely invaluable when working with users on a variety of platforms.

Despite all the simplification of the underlying graphics library, Processing still feels like you are in complete control of every mark you make on the screen. I almost never feel like I need to find a way around a function to get the sort of control I want. The design decisions that went into creating the Processing API are fabulous. Really.

As for the drawbacks, there aren’t any really great libraries (yet) for basic user interface widgets. Which, for me, is ok because I’m kinda neurotic about how my scroll bars look and act. But for graphics beginners this can be a real time-sink. The same goes for more sophisticated types of visual representations like basic charts, maps, and networks. Other languages like Protovis provide built-in algorithms for handling these very common types of representations. In Processing, you’ll have to implement your own graph layout algorithm (or find one on the web). Again, this can be a hurdle for people with less programming experience.

And as a small gripe — Processing has implementations of Bezier and Catmull-Rom curves … but where is the love for b-splines???

Ok, I am a beginner and I want to learn Processing, where do I start?

Go to the Processing website. Click on the Learning tab. Explore.

On the Learning page you’ll find a whole series of tutorials and examples that can walk you through the basic functions of Processing. The next step is to peruse the inspiring demos in the Exhibition, many of which will include example code. When you see a function you don’t understand, the Reference page has wonderful documentation for the language.

If you are new to programming or graphics programming the two books I recommend are:

You can work through the Getting Started book in a day. It’s short and sweet. If you find that you need more help, the Shiffman book includes more details on how to program and lots of paper-and-pencil and coding exercises. Daniel Shiffman wrote this book from course notes he created while teaching design students at NYU about coding and Processing. It’s an intro to programming via Processing.

If you are an experienced graphics programmer all you need is what you can find on the Processing website.

How is the learning curve vs. return-on-investment of Processing?

If you know OpenGL and are familiar with Java, the learning curve is super short and shallow. If you are new to graphics, it will take you less time to wrap your head around Processing than OpenGL. And if you are new to programming, Processing is a really fun way to learn the basics.

With that said, it is still a programming language. Reading in data from a file requires basic coding skills, as does just about any interesting interactive visualization. You have to be comfortable with for-loops and arrays. Processing makes graphics programming way easier, but it doesn’t automatically generate visual representations of data. You have to code that.

If you want control over every aspect of your visualization and interaction designs, then you really just have to program. Processing is one of the best languages to use for that. If you just want to see what your data looks like, then there are other tools that can do this quickly with built-in visual representations (like Tableau, Many Eyes, Matlab, R, etc.).

What other tools would you recommend other than Processing?

I’d recommend any of the tools and languages I’ve mentioned previously. Another gem is ColorBrewer for selecting great colormaps.

Still, nothing beats OpenGL for truly understanding how graphics works. If you are serious about developing interactive visualizations, I think that taking an intro to graphics course that uses OpenGL is invaluable. Understanding the rendering pipeline and how it is implemented in a computer will make the seemingly quirky aspects of even a language like Processing make sense.

New Series: Tools from the Pros

Hello everyone! I am happy to introduce a new series nested in the Data Visualization Beginners Toolkit: “Tools from the Pros”.

In my last post on data visualization tools I suggested a number of strategies for choosing the best tool for you, and I provided a list of those I think are the best bets currently available. Now, while I think this list is already very useful, I decided to give you more, and I interviewed one data visualization expert for each tool mentioned in the list.

I will be publishing the interviews over the next few weeks. Some of them are still in preparation, and the list might be expanded in the future as comments and requests come in (please feel free to ask!). What I can tell you for now is that I have the following interviews in the editing stage and that the first one will come very, very soon:

Each one is a real pro in his or her area and knows very well the tool he or she uses to make effective visualizations. I am sure you will get a lot of useful information out of them.

To each one I asked the following questions:

  • How did you start doing visualization with X?
  • What’s the best and worst aspect of X?
  • Ok, I am a beginner and I want to learn to do visualization with X, where do I start?
  • How is the learning curve vs. return-on-investment of X?
  • What other tools would you recommend other than X?

I hope you’ll enjoy it. Stay tuned! The first one is coming very soon.

Important: if you are an expert and are willing to answer these questions about your favorite tool I’d be happy to include you in the list!

More important: specific requests for other person/tool interviews are welcome! Who else would you like me to catch? About which tool? I cannot promise anything, but I’d love to receive your requests.

Take care guys, and have fun.

How to Become a Data Visualization Freelancer | Interview with Moritz Stefaner

I kept my promise: the interview with Moritz Stefaner on data visualization freelancing is finally here! And I am really excited.

As I said in my introductory post, I think data visualization freelancing is one of the most exciting trends in visualization, even though it’s a little bit hidden. After recording the interview, I must say I am really satisfied. I learned something from it, and I am sure the same will be true for you.

The video is a bit long (see the content breakdown below) but it’s really worth it: we covered a very large number of questions and they all came directly from the readers (thanks to all of you guys)!

Any comment, question, or suggestion for me or Moritz is more than welcome. You can write a comment below or contact us directly on Twitter (@FILWD, @moritz_stefaner). Have fun!

Video content breakdown with timing:

  • [01:00] Starting Out (building a portfolio)
  • [08:30] Design Practice (iterative approach, designing 20 prototypes!)
  • [16:18] Skill Building and References (books, tools and libraries, doing without programming?)
  • [27:47] Dealing with Clients (what clients want vs. what is right, freelancer vs. agency)
  • [34:08] Pricing (billable time, tracking yourself, strategic prices, the “pain coefficient”)
  • [39:44] Time Management (avoid working 24/7, have kids!, having a rhythm)
  • [41:55] The Freelancing Market (gaps in the market)
  • [44:05] The Role of Research (searching and reading papers)
  • [47:24] Summary of Tips for Wannabe Freelancers

Additional versions of the interview

  • Download mp3 file to listen on your own player.
  • Interview transcription (if you want to read it) – coming soon.

Do you want more? Let us know.

As you can see in the interview, Moritz and I are thinking of recording additional videos. Who knows … this might even become a regular meeting. We would love to have your opinion on that. In particular, we would love to know: what else would you like to hear? Is there anything we missed? How can we help you further? For sure, we would like to record a new one with a more extensive discussion of design practices and the overall data visualization process. Stay tuned and let us know!


  • First of all, thank you guys for sending all your questions for Moritz! This was very useful, and I am sure the interview is much, much better than it would have been without your help.
  • A big, big thanks to Moritz. I really enjoyed talking with him (as usual) and I think the end result is really helpful for people who want to know more about freelancing.
  • The quality of the video is not perfect, I apologize. There is so much to learn! My phone started ringing at some point, the line dropped because I forgot to plug in the power cable, and there’s no video editing apart from very basic stuff. Nonetheless, I think that content is king and what matters is that you are going to learn something. This is a work in progress and it will get better.
As usual, I’d love it if you could help me spread the word. Please retweet the post if you like it and add comments below. There’s more to come.

Enjoy it and take care,

Ever Dreamed of Becoming a Data Visualization Freelancer? Ask Moritz How.

One of the most exciting and quiet phenomena I have seen develop over the last months and years in data visualization is the growing number of people who are transitioning to, or have already succeeded in, making a living as data visualization freelancers.

Who is a data visualization freelancer? Basically, a self-employed person who sells data visualization services to companies and institutions. What kind of services? I don’t know … the sky is the limit.

There are people like Jan Willem Tulp, who designs pleasurable and accurate visualizations and is just turning into a full-time information visualizer. There is Andy Kirk, who writes a notable blog and teaches visualization in 1-day workshops. There is of course a veteran like Stephen Few, who consults for big BI companies like Tableau Software and writes some of the best books in the field. And people like David McCandless, who designs insanely popular (and much-discussed) beautiful visualizations.

Finally, there is Moritz Stefaner, who works from home and seems to have the rather rare ability to combine aesthetics with function and make everyone happy; as with the recently acclaimed OECD Better Life Index.

The brief story of Moritz and me at the airport.

And it’s exactly by talking with Moritz that I came up with the idea of digging deeper into this fantastic world. The thing went more or less like this: I met Moritz at the airport, when we were both invited for a panel at Visualizing Europe, and during a casual chat he told me something along these lines: “You know Enrico … academia and research are cool but at some point I wanted to make real stuff for real people“. Yeah sure, I understand. “… I always had small consultancy jobs during my studies so I had some experience … at some point I decided to become a data visualization freelancer“. Cool! “… so you know what? I work from home, I can plan my time myself and make sure I play with my kids and talk with my wife” Uh!? “… and of course I am successful and I am invited here and there, so I am travelling a bit around the world and meeting interesting people“.

Ok, are you salivating already or what? I must admit, even though I really love my academic job, I felt a bit jealous.

Send your questions! I will interview Moritz next week.

Ok so … I will be interviewing Moritz next week about data visualization freelancing. I started collecting a number of questions for him but I need your help! What would you like to ask Moritz? What are you curious about? Is there a nasty question no one has the courage to ask? I think it would be much better if you guys tell me what *you* want to know. So, don’t miss this opportunity. You could realize that being a data visualization freelancer is not a dream. It’s definitely possible! And Moritz can tell you how, or at least give you some pointers.

(On a side note, I will be experimenting with skype-based video interviews for the first time. I am totally excited by this new format and I’d love to have your feedback once it is done. The only video I currently host here is my interview with JD Fekete on Jacques Bertin, but the quality is really bad. I did some initial tests and the results look amazing. I really hope you will like it.)

A few additional reflections.

1st – Freelancing and working from home is not exclusive to data visualization; it is part of a bigger trend and it’s, in my opinion, absolutely awesome. The web is full of bloggers and small entrepreneurs who make a living by writing their blogs and offering their services from home. If you want to know more, give the super-successful 4-Hour Workweek a read. Note: some people love it and some people absolutely detest it, but no one can deny it captures a strong trend in our society. You can decide to ignore it, or to read it and take the risk of having your life changed. You decide.

2nd – Being a professor used to be one of the best jobs in the world for the amount of freedom professors have (and it still has plenty of benefits in my opinion), but being a freelancer working from home is a serious competitor. Academia has its glow of knowledge and a little bit of mystery on its side, but this whole segmentation of professions is going to disappear anyway. I see academia and freelancing as somewhat similar, since they both offer a really special amount of freedom. Academia maybe gives more opportunities to really do whatever comes to your mind, but freelancing, as far as I can tell, comes with considerably less bureaucracy to put up with.

3rd – Most importantly, I believe freelancing is just a perfect fit for data visualization. I am not a big fan of generalist visualization because I think people get the best out of it when it meets the specific needs of a project. And this happens when you have a competent person able to listen, understand, and offer a tailored solution. For this reason I am a strong proponent of data visualization freelancing. It pisses me off that nobody is talking about it, because it’s really a great trend. Plus, the more competent visualization designers we have around, the more great examples we will be able to show people for inspiration.

Again: send your questions!

Ok, let me repeat it. I will be interviewing Moritz some time next week. This means the interview will appear here in about 10 days at the earliest. I will be able to take your questions into account only if they come in during the next few days. Don’t miss this opportunity; send your questions or ideas to me and Moritz. The easiest way is to add a comment below. Otherwise you can contact us on Twitter (@FILWD and @moritz_stefaner) or send me a private message.

Take care, have fun. We are waiting for your input!

Video Interview: JD Fekete talks about Jacques Bertin

As you might know, Jacques Bertin, the great geographer who laid the foundations of information visualization, passed away a few months ago.

During the last VisWeek, Jean-Daniel Fekete gave a Salute to Jacques Bertin and I was totally amazed to learn all these details about him: the impact he had on our field, the way he thought about visualization, the unexplored gems hidden in his Semiology of Graphics, etc.

Then I met Jean-Daniel at the Dagstuhl seminar on Visual Analytics (from where I am writing this post) and I wanted to know more. He actually met Jacques Bertin a few years ago, and I really wanted to hear more about him. Plus, I thought I could make a video interview out of it! And here it is.

Interview: Jean-Daniel Fekete talks about Jacques Bertin from FILWD on Vimeo.

And oh yes … if you don’t know Jean-Daniel yet, you’d better take a look at AVIZ, the research group he leads. They have lots of solid and cool stuff. Check it out!

Note: please forgive me, the video is totally unprofessional. The intro is too long, our faces are reddish, the sound is barely ok, and my English is sloppy (I was kind of nervous :-)), etc … but ok, the environment is nice and the content is great, so be sure to check it out. Jean-Daniel reveals many interesting things.

(Many thanks to Florian for helping me and to Silvia and Carsten for being present (behind the curtains) and making the whole thing even funnier.)

I hope you’ll enjoy it. Let me know!