New Zealand story

I’m back in Dublin from my two-week expedition to New Zealand, the main reason for which was (ostensibly) to attend the IEEE Congress on Evolutionary Computation in Wellington. I’ve been back since Saturday actually, so by now the worst of the jet lag is behind me and it’s time to do a write-up of my doings and dealings down under. Besides NZ, I had the opportunity to pay a quick visit to Australia as well, since I had a stopover in Sydney that lasted from 6am to 6pm – plenty of time to catch a train from the airport to Circular Quay and snap some smug selfies with the famous opera house prominently in the background.

Having the long break between flights in Sydney proved a good decision, because even though the final hop from Sydney to Wellington was a relatively short one, by this point I had already flown seven and a half hours from Dublin to Dubai, followed by a two-hour stopover before the connecting flight to Sydney, which was just shy of fourteen hours. As a result of all this I wasn’t in much of a mood to do any more flying until I was well and truly rid of the stiffness of body and mind that comes from spending 20-plus hours seated inside a cramped aluminium tube in the sky, and a few hours of sightseeing on foot on what turned out to be a pleasantly warm and sunny day helped a great deal in achieving that. Another move I thanked myself for was having purchased access to the Qantas business lounge at Sydney airport, allowing me to enjoy such welcome luxuries as a comfy chair, a barista-made espresso and a nice shower before facing the world outside.

With the combined effect of the flight and transfer times and the 11-hour time difference, I arrived in Wellington near midnight on the evening of Sunday the 9th, having departed from Dublin on Friday evening. Monday the 10th was the first day of the conference, but it was all tutorials and workshops, none of which were particularly relevant to my own research, so I gave myself permission to sleep in and recharge before attempting anything resembling work. In fact the only “conference sessions” I attended on that first day were lunch and afternoon coffee; the rest of the time I spent at the venue I just wandered around Te Papa, exploring the national museum’s fascinating exhibitions on the nature, culture and history of New Zealand.

On the second day I began to feel the effects of jet lag for real, but I thought it was time to be a good soldier and check out some presentations. Although I don’t really do evolutionary computation myself, it has various applications that interest me professionally or personally, so it wasn’t too hard to find potentially interesting sessions in the programme. The highlight of the day for me was a session on games where there was, among others, a paper on evolving an AI to play a partially observable variant of Ms. Pac-Man; being a bit of a retrogaming geek, I found it quite heartwarming that this is an actual topic of serious academic research!

On the third day I forced myself to get up early enough to hear the plenary talk of Prof. Risto Miikkulainen, titled “Creative AI through Evolutionary Computation”. I was especially looking forward to this talk, and I was not disappointed: Prof. Miikkulainen built a good case for machine creativity as the next big step in AI and for the crucial role of evolutionary computation in it, with a variety of interesting supporting examples of successful applications. I am inclined to agree with the audience member who remarked that the conclusions of the talk were rather optimistic – it’s quite a leap from optimising website designs to optimising the governance of entire societies – but even so, a highly enjoyable presentation. Later that day there was a special session on music, art and creativity, which I also attended, but my enjoyment of it was hampered by my being in acute need of a nap at this point.

The fourth and final day of the conference I mostly spent preparing for my own presentation, which was in the special session on ethics and social implications of computational intelligence. This took place in the late afternoon, so the conference was almost over and attendance in the session was predictably unimpressive: I counted ten people, including myself and the session chair. Fortunately, numbers aren’t everything, and there was some good discussion with the audience after my talk, which dealt with wearable self-tracking devices and the problems that arise from the non-transparency of the information they generate and the limited ability of users to control their own data. I also talked about the problems and potential social impact of analysing self-tracking data collaboratively, tying the paper up with the work I’m doing in the KDD-CHASER project.

After the conference I proceeded to have a week’s vacation in NZ, which of course was the real reason I went to all the trouble of getting myself over there. While it’s not a huge country – somewhat smaller than my native Finland in terms of both area and population – I still had to make some tough choices when deciding what to see and do there, and I came to the conclusion that it was best to focus on what the North Island has to offer. I rode the Northern Explorer train service to Auckland and spent three nights there before working my way back to Wellington by bus, stopping along the way to spend two nights in Rotorua. From Wellington I did a day trip by ferry to Picton, a small town in the Marlborough Region (of Sauvignon blanc fame) of the South Island.

On Friday, two weeks after my departure from Dublin, I started my return journey, this time via Melbourne and with no time to go dilly-dallying outside the airport between flights. I boarded my flight in Wellington feeling a little sad to be leaving NZ so soon, but also satisfied that I’d made the most of my time there. I might have been able to fit in some additional activities if I’d travelled by air instead of overland, perhaps even another city, but I like to be able to view the scenery when I’m travelling, and there was no shortage of pretty sights along the train and bus routes. The conference also left a positive feeling: the programme was interesting, the catering was great and the choice of venue just brilliant. Above all, I’m happy to be done with all the flying!

Getting fit, bit by bit

I’ve been making decent progress on my software, and while it’s no good yet for any kind of data analysis, it can already be used to do a number of things related to the management of datasets and collaborations. I may even unleash the current incarnation upon some unsuspecting human beings soon, but for now, I’m using myself as my first guinea pig, so I’ve started wearing one of the Fitbits I bought myself (or rather, for my project) for Christmas. From the perspective of my research, the reason for this is that I need to capture some sample data so I can see what it looks like when it’s exported from the Fitbit cloud into a file, but I’m also personally interested in seeing firsthand what’s happened in fitness trackers since the last time I wore one, which was quite a few years ago and then also for research purposes.

Back then I wasn’t hugely impressed, but it seems that by now these gadgets have advanced enough in terms of both functionality and appearance that I would consider buying one of my own. My initial impression of the Fitbit was that it’s quite sleek but not very comfortable; no matter how I wore it, it always felt either too loose or too tight. However, it seems that I either found the sweet spot or simply grew accustomed to it because it doesn’t bother me that much anymore, although most of the time I am still aware that it’s there. I’m probably not wearing it exactly as recommended by the user manual, but I can’t be bothered to be finicky about it.

By tapping on the screen of the device I can scroll through my basic stats: steps, heart rate, distance, energy expenditure and active minutes. More information is available by launching the Fitbit app; this is where I see, for example, how much sleep the device thinks I’ve had. Here I could also log my weight and what I’ve eaten if I were so inclined. Setting up the device and the app so that they can talk to each other takes a bit of time, but after that the device syncs to the app without any problems, at least on Windows. However, for some reason the app refused to acknowledge that I’m wearing the Fitbit on my right wrist rather than my left; I had to change that setting on the website to make it stick. The website is also where I export my data, which is quick and straightforward to do, with a choice between CSV and Excel for the data format.
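For the curious, here’s roughly how I intend to start poking at the exported file once there’s a decent amount of data in it. This is just a minimal sketch in Python; the column names (“Date”, “Steps”), the preamble line before the table and the date format are my assumptions about what the CSV export looks like rather than anything documented, so it would need adjusting to match the actual file.

```python
# A minimal sketch of loading a Fitbit CSV export with pandas.
# NOTE: the column names ("Date", "Steps") and the single preamble line
# before the table are assumptions about the export format, not a
# documented specification -- adjust to match your own file.
import pandas as pd

def load_daily_activity(path: str) -> pd.DataFrame:
    """Read the daily activity table from an exported CSV file."""
    df = pd.read_csv(
        path,
        skiprows=1,            # assumed section header line, e.g. "Activities"
        thousands=",",         # step counts may be written as "12,345"
        parse_dates=["Date"],
        dayfirst=True,         # assumed day-first dates; check your locale
    )
    return df.set_index("Date").sort_index()

if __name__ == "__main__":
    activity = load_daily_activity("fitbit_export.csv")
    print(activity[["Steps"]].describe())
```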

The accuracy of the data is not really my number one concern, since I’m interested in the process of collaborative data analysis rather than the results of the analysis. However, on a personal note again, it is interesting to make observations on how the feedback I get from the device and the app relates to how I experience those aspects of my life that the feedback is about. For example, I can’t quite escape the impression that the Fitbit is flattering me, considering how consistently I’ve been getting my daily hour or more of activity even though in my own opinion I certainly don’t exercise every day. On the other hand, I do get a fair bit of walking done on a normal working day, including a brisk afternoon walk in the park next to the university campus whenever I can spare the time, so I guess it all adds up to something over the course of the day.

Based on my fairly brief experience, I can already see a few reasons for the rising popularity of wearables such as the Fitbit. Even if the accuracy of the data in terms of absolute values leaves something to be desired, presumably the device is at least reasonably consistent with itself over time, so if there are any rising or falling trends in your performance, they should be visible in the data. To make the product more friendly and fun to use, the developers have used a host of persuasion and gamification techniques; for example, there are various badges to be earned, with quirky names like “Penguin March”, and occasionally the device gets chatty with me, offering suggestions such as “take me for a walk?”. When I reach the daily magic number of ten thousand steps, the Fitbit vibrates a little silent congratulatory fanfare on my wrist.
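To make the point about trends a little more concrete, here’s another small Python sketch that smooths a series of daily step counts with a rolling mean; the numbers are made up for illustration, not my actual data. Even if every individual reading is off by a roughly constant amount, a rising or falling trend should still show up in the smoothed series.

```python
# A rough illustration: absolute step counts may be inaccurate, but if the
# device is consistent with itself, trends should survive smoothing.
import pandas as pd

def weekly_trend(daily_steps: pd.Series, window: int = 7) -> pd.Series:
    """Centred rolling mean of a daily step-count series."""
    return daily_steps.rolling(window=window, center=True, min_periods=1).mean()

# Made-up example data (not my actual Fitbit export)
steps = pd.Series(
    [8200, 11500, 9400, 7800, 12100, 6300, 9900, 10400, 8700, 11800, 9600, 12400],
    index=pd.date_range("2019-02-01", periods=12, freq="D"),
)
print(weekly_trend(steps).round())
```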

In terms of what I need to carry out my project, the Fitbit will definitely serve: setting it up, syncing it and exporting the data all seem to work without any big hassle. As for whether I’m going to get one for myself, I would say that it’s now more likely than before that I will get some kind of wearable – not necessarily a Fitbit, but one that will give me the same kind of information anyway. Having this opportunity to try out a few different ones is an unexpected perk of the project that I now suddenly welcome, even though I wasn’t particularly interested in these devices when I was applying for the grant.

Getting engaged

Besides research, one of the things I’m supposed to be doing as a Marie Curie research fellow is learning new things. Of course, that’s a good thing to be doing regardless of what other things you do, but in the case of the fellowship, it’s expected of me by the funder that I spend my time in Dublin doing things that will help me develop myself and my career prospects. I’ve already learned quite a few useful new things through my research work here, but I’ve also attended a number of training courses and workshops on various topics, and last week I had the opportunity to go to a particularly interesting one dealing with engaged research.

I learned about the workshop from the education and public engagement manager of the Insight Centre, who sent me an email about it and recommended that I sign up. I wasn’t previously familiar with the concept of engaged research, but as I was reading the description of the workshop, it soon became clear that it applies to quite a few, perhaps most, of the research projects I’ve been involved in over the course of my career so far. The gist of the definition is that it describes research where the individuals or organisations for whom the research is relevant are involved in it, not merely as recipients of the eventual results but as co-creators of them. In my case the engaged partners have mostly come from industry, but they could also represent the public sector, civil society or the general public.

The workshop was facilitated by people from Campus Engage, a network that aims to “promote civic and community engagement as a core function of Higher Education on the island of Ireland”. Since Finnish universities have had social influence as their so-called third mission (research and education being the first two) for quite a few years now, this statement also rings very familiar. A few days before the workshop, we were requested to fill out a survey with questions about our background and what sort of lessons we were particularly hoping to take home, which the facilitators then used to tailor the content to the interests of those attending the training.

A whole day of training can get very boring if it’s not well planned and executed, but there was no such problem here, as the presentations given by the facilitators were interspersed with discussions of our own questions and experiences, as well as small group activities. The latter involved, for example, studying an extract from a research project proposal and coming up with ways to improve it in terms of stakeholder engagement. One of the things I was hoping the training would give me was information and ideas on how to develop the engagement aspect of my own proposals, and this certainly qualified, although strictly speaking it was perhaps more of a cautionary example of how it’s not meant to be done. We did get more constructive planning tools as well, such as the logic model, a way of planning for long-term impact by laying out the path there as a series of if-then relationships starting with an analysis of the current situation.

Another thing about the workshop that I enjoyed was that we discussed some actual real-world cases of community engagement in action. A particularly interesting one was Access Earth, a mobile app that can be used by people with accessibility needs to find and rate hotels and restaurants by criteria such as accessible parking and wide doors. Clearly the key to successful implementation and deployment of such an application is engaging the people who are going to use it, both to get the design right and to collect the data on the accessibility of various places around the world, and one of the facilitators of the workshop has been working on the project as a community engagement advisor. The app is available worldwide, so the potential impact on the lives of people with disabilities is big – an inspiring example of what engagement is in practice and what it can accomplish.

Philosophical ruminations vol. 1

Holidays are over and I’m back from the Finnish winter wonderland to Ireland, who seems to retain an appreciable fraction of her famous forty shades of green even in the middle of winter. No sign of the pre-Christmas frenzy anymore – I’ve been working at a fairly leisurely pace for these past few weeks, enjoying the luxury of being able to take the time to have a good think about what I’m doing before I actually do it. The only deadline of immediate concern was the extended deadline of the conference for which I was preparing a paper before the holidays, and since I didn’t dare rely on there being an extension, I all but finished the manuscript during the break, so there wasn’t much left for me to do to it after I got back on the clock.

Since things are not so hectic now, I thought this would be a good time for a post discussing a topic that’s not directly concerned with what’s going on in my project at the moment. When I started the blog, my intention was that one of the themes I would cover would be the philosophical dimension of knowledge discovery, and there’s a certain concept related to this that’s been on my mind quite a lot lately. The concept is known as epistemic opacity; that’s epistemic as in epistemology – the philosophical study of knowledge – and opacity as in, well, the state of not being transparent (thanks, Captain Obvious).

I ran into this concept in a paper by Paul Humphreys titled “The philosophical novelty of computer simulation methods”, published in the philosophy journal Synthese in 2009. Humphreys puts forward the argument that there are certain aspects of computer simulations and computational science that make them philosophically novel as methods of scientific enquiry, and one of these aspects is their epistemic opacity, which he defines as follows:

[…] a process is epistemically opaque relative to a cognitive agent X at time t just in case X does not know at t all of the epistemically relevant elements of the process. A process is essentially epistemically opaque to X if and only if it is impossible, given the nature of X, for X to know all of the epistemically relevant elements of the process.

That’s a bit of a mouthful, but the gist of it – as far as I understand it – is that computer simulations are opaque in the sense that there is no way for a human observer to fully understand why a given simulation behaves the way it does. This makes it impossible to verify the outcome of the simulation using means that are independent of the simulation itself; a parallel may be drawn here with mathematics, where there has been criticism of computer-generated proofs that are considered non-surveyable, meaning that they cannot be verified by a human mathematician without computational assistance.

The philosophical challenge here arises from the fact that since we have no means to double-check what the computer is telling us, we are effectively outsourcing some of our thinking to the computer. To be fair, we have been doing this for quite some time now and it seems to have worked out all right for us, but in the history of science this is a relatively new development, so I think the epistemologists can be excused for still having some suspicions. I doubt that anyone is suggesting we should go back to relying entirely on our brains (it’s not like those are infallible either), but I find that in any activity, it’s sometimes instructive to take a step back and question the things you’re taking for granted.

The algorithms used in knowledge discovery from data can also be said to be epistemically opaque, in the sense that while they quite often yield a model that works, it’s a whole different matter to understand why it works and why it makes the mistakes that it does. And they do make mistakes, even the best of them; there’s no such thing as a model that’s 100% accurate 100% of the time, unless the problem it’s supposed to solve is a very trivial one. Of course, in many cases such accuracy is not necessary for a model to be useful in practice, but there is something about this that the epistemologist in me finds unsatisfying – it feels like we’re giving up on the endeavour to figure out the underlying causal relationships in the real world and substituting the more pedestrian goal of being able to guess a reasonably accurate answer with adequate frequency, based on what is statistically likely to be correct given loads and loads of past examples.

From a more practical point of view, the opacity of KDD algorithms and the uncertainty concerning the accuracy of their outputs may or may not be a problem, since some users are in a better position to deal with these issues than others. Traditionally, KDD has been a tool for experts who are well aware of its limitations and potential pitfalls, but it is now increasingly being packaged together with miniaturised sensors and other electronics to make a variety of consumer products, such as the wearable wellness devices I’m working with. The users of these products are seldom knowledge discovery experts, and even for those who are, there is little information available to help them judge whether or not to trust what the device is telling them. The net effect is to make the underlying algorithms even more opaque than they would normally be.

Now, I presume that by and large, people are aware that these gadgets are not magic and that a certain degree of skepticism concerning their outputs is therefore warranted, but it would be helpful if we could get some kind of indication of when it would be particularly good to be skeptical. I suspect that often it’s the case that this information exists, but we don’t get to see it basically because it would clutter the display with things that are not strictly necessary. Moreover, this information is lost forever when the outputs are exported, which may be an issue if they are to be used, for instance, as research data, in which case it would be rather important to know how reliable they are. I’d be quite interested in seeing a product that successfully combines access to this sort of information with the usability virtues of today’s user-friendly wearables.

Busy times

With the end-of-year holidays approaching, things tend to get busy in a lot of places, not just in Santa’s workshop. My life in Ireland is no exception: there are five major work-related (or at least university-related) things that I’ve been trying my best to juggle through November, with varying success. Many of these will culminate over the next two weeks or so; after that, I’m hoping it will be comparatively smooth sailing till I leave for my well-deserved Christmas break in Finland. The blog I’m not even counting among the five, and I’ve been pretty much neglecting it, so this post is rather overdue, and also a welcome break from all of the more pressing stuff that I should really be working on right now.

One area where I’ve had my hands full is data protection, where it seems that whenever a document is finished, there’s always another one to be prepared and submitted for evaluation. Getting a green light from the Research Ethics Committee was a big step forward, but there’s now one more hurdle left to overcome in the form of a Data Protection Impact Assessment. I’m very much learning (and making up) all of this as I go along, and the learning curve has proved a rather more slippery climb than I expected, but I’m getting there. In fact, I’m apparently one of the first to go through this process around here, so I guess I’m not the only one trying to learn how it works. I hope this means that things will be easier for those who come after me.

Meanwhile, I’ve been preparing to give my very first lecture here at DCU – thankfully, just one guest lecture and not a whole course, but even that is quite enough to rack my nerves. It is a little strange that this should be the case, even after all the public speaking I’ve had to do during my fifteen-plus years in research, but the fact of the matter is that it does still feel like a bit of an ordeal every time. Of course it doesn’t help that I’m in a new environment now, and also I’ll be speaking to undergraduate students, which is rather different from giving a presentation at a conference to other researchers. Still, I’m not entirely unfamiliar with this type of audience, and I can recycle some of the lecture materials I created and used in Oulu, so I think I’m going to be all right.

Speaking of conferences, I’m serving on the programme committee of the International Conference on Health Informatics for the second year running and the manuscript reviewing period is currently ongoing, so that’s another thing that’s claimed a sizable chunk of my time recently. Somewhere among all of this I’m somehow managing to fit in a bit of actual research as well, although it’s nowhere near as much as I’d like, but I guess we’ve all been there. The software platform is taking shape towards a minimum viable product of sorts, and I have a couple of ideas for papers I want to write in the near future, so there’s a clear sense of moving forward despite all the other stuff going on.

So what’s the fifth thing, you ask? Well, I’ve rekindled my relationship with choral singing by joining the DCU Campus Choir, having not sung in a proper choir since school. Despite the 20-year gap (plus a bit), I haven’t had much trouble getting into it again: I can still read music, I can still hit the bass notes, and I don’t have all that much to occupy myself in the evenings and weekends so I have plenty of time to learn my parts (although I’m not sure how happy my neighbours are about it). The material we’re doing is nice and varied, and the level of ambition is certainly sufficient, as it seems like we’re constantly running out of rehearsal time before one performance or other. Our next concert will be Carols by Candlelight at DCU’s All Hallows campus on the evening of Monday the 10th of December, so anyone reading this who’s in town that day is very warmly welcome to listen!

A Solid foundation for social apps?

Tim Berners-Lee recently posted an open letter on the web, announcing to the wider online community the launch of Solid, a new technology platform that he and his team at MIT have been working on for the past few years. Like a lot of people these days, he’s not too happy about the way our personal data is being controlled and exploited by providers of online services, but when the father of the web is telling you how it’s not gone the way he intended, you may want to prick up your ears even if you personally have no problem with the way things are. Not only that, but when he says he’s come up with something that we can use to set things right, it’s probably worth checking out.

We’ve all seen the headlines that result when a company with a business model based on aggregating and monetising personal data gets negligent or unscrupulous with the data in its possession, but these incidents are really just symptoms of a more fundamental issue concerning the architecture of basically every popular online social application out there. Even if we imagine a perfect world of ideal application providers that are completely open and honest about how they use your data and never suffer any security breaches, the fact remains that they, not you, control the data you’ve given them. You still own it, yes, but they control it.

Why is this an important distinction? The answer has to do with the coupling of your data with the specific services you’re using: you can’t have one without the other. As a result, your data is broken up into pieces that are kept in separate bins, one for each service, even when it would be really helpful to have it all in the same place. If you want to use several services that all use the same data, you have to upload it to each one separately, and that’s assuming that you have or can get the data in a reusable format, which isn’t always the case. It would make a lot more sense to have just a single copy of the data and permit the services to access that – within privacy parameters that you have complete control of – and it would be even better if you could move your data to a different location without breaking all those services that depend on it.

Sound good? Well, the people behind Solid apparently want you to be able to do just that. Their proposed solution is based on decoupling data from applications and storing it in units called PODs (short for personal online data store). Applications built on the Solid platform can access the data in your POD if you give them permission to do so, but they don’t control the data, so they can’t impose any artificial restrictions on how you use, combine and reuse data from different sources. The end-users of Solid are thus empowered to make the best possible use of their data while retaining full control of what data they disclose and to whom, which is very much what I’m aiming for in my own research; I can easily see collaborative knowledge discovery as an app implemented on Solid or some similar platform.

So that’s the theory, anyway. What about reality? I can’t claim to have examined the platform in great depth, but at least on the surface, there are a number of things that I like about it. It’s built on established W3C specifications in what looks like a rather elegant way where linked data technologies are used to identify data resources and to represent semantic links between them – for example, between a photo published by one user and a comment on the photo posted by another. Besides your data, your POD also holds the identity you use to access various resources, somewhat like you can now use your Google or Facebook credentials to log in to other vendors’ services, but without the dependence on a specific service to authenticate your identity. Of course, you still need to get your Solid POD from somewhere, but you’re free to choose whichever provider suits you best, or even to set up your own Solid server if you have the motivation and the means.
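To make the linked-data part a little more concrete, here’s a toy sketch in Python using rdflib of how a photo in one person’s POD and a comment in another’s might be linked. To be clear, the URIs and the choice of schema.org as the vocabulary are purely my own illustrative assumptions, not something Solid itself prescribes.

```python
# A toy illustration of linked data resources, NOT the actual Solid data model:
# two resources living under different (hypothetical) POD URLs, with the
# comment pointing at the photo it is about.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

SCHEMA = Namespace("http://schema.org/")

photo = URIRef("https://alice.example.org/photos/sunset.jpg")    # hypothetical POD URL
comment = URIRef("https://bob.example.org/comments/42")          # hypothetical POD URL

g = Graph()
g.add((photo, RDF.type, SCHEMA.Photograph))
g.add((photo, SCHEMA.creator, URIRef("https://alice.example.org/profile#me")))
g.add((comment, RDF.type, SCHEMA.Comment))
g.add((comment, SCHEMA.about, photo))    # the semantic link between the two
g.add((comment, SCHEMA.text, Literal("Lovely colours!")))

print(g.serialize(format="turtle"))
```

The point is simply that the link lives in the data itself, so any application with permission to read both resources can follow it, regardless of which application created them.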

Whether Solid will catch on as a platform for a new class of social web apps is not just a matter of whether it is technically up to the challenge, though. The point of social media is very much to have everyone in your social network using the same applications, so the early adopters won’t have much of an impact if their friends decide that it’s just so much more convenient to keep using the apps where they already have all their connections and content rather than to switch platforms and build everything up all over again – which, of course, is precisely the sort of thinking the providers of those apps are counting on and actively reinforcing. People like me may give Solid a go out of sheer curiosity, but I suspect that the majority can’t be bothered unless there are Solid apps available that let them do things they really want to do but haven’t been able to before. Taking control of your own data is a noble principle for sure, but is it enough to attract a critical mass of users?

Then there’s the question of how the Solid ecosystem will work from a business perspective. The supply of interesting applications is going to be quite limited unless there’s money to be made by developing them, and presumably the revenue-generation models of centralised social apps can’t be simply dropped in a decentralised environment such as Solid without any modifications. We pretty much take it for granted now that we can “pay” for certain kinds of services through the act of using them and generating data for the service provider to use as raw material for services that the provider’s actual customers will pay good money for, but would – and should – this work if the provider could no longer control the data? On the other hand, would we be willing to pay for these services in cash rather than data, now that we’ve grown so used to getting them for “free”? Then again, there was a time when it was not at all clear how some of today’s multi-billion-dollar companies were ever going to turn a profit, so maybe we just need the right sort of minds to take an interest for these things to get figured out.

It’s also worth noting that Solid is by no means the only project aiming to make the web less centralised and more collaborative. There is a substantial community of researchers and developers working on solutions to various problems in this area, as evidenced by the fact that Solid is but one of dozens of projects showcased at the recent DWeb Summit in San Francisco, so it may well turn out that even if Solid itself won’t take off, some other similar thing will. I won’t be betting any money on any of the contenders just yet, but I probably will get myself a Solid POD to play with so I can get a better idea of what you can do with it.

Rocky road to Dublin

When I first arrived at DCU to begin my MSCA fellowship, my supervisor paraded me around the place introducing me to various people, most of whose names I promptly forgot. (Sorry!) What did stick in my mind, however, were the numerous congratulations I received on winning such a competitive grant, which the Marie Curie fellowship certainly is, and on getting such a sweet deal, which it most definitely is. But the way here wasn’t all good times and glowing reviews – far from it, in fact. That’s why I thought I’d share the story of how I came to be in Dublin and how things might have turned out quite differently, had I been a bit less perseverant than I was.

I got my doctorate in 2014 at the relatively mature age of 35, having gone through a process that was a good deal more convoluted than it was, in theory, supposed to be. Following my academic baptism of fire, working as a Master’s student in a project dealing with computational quality assurance of spot welding joints, I embarked on a rather erratic journey that saw me dabble with research topics as varied as exergame design and ethics of scam baiting. I eventually pieced together my dissertation around the theme of knowledge discovery in databases, focusing on what the overall process looks like from different perspectives and how it can be supported effectively.

Not only was the process of writing my thesis unnecessarily complicated, but so was the process of getting the manuscript reviewed and accepted for publication. When the pre-examination phase was finally over and I was galloping, or at least cantering briskly, into the home stretch, there turned out to be one more obstacle to clear: finding an external examiner for the defence. I pitched a number of names to my then-supervisor, but one by one they all respectfully declined the invitation, not having the time to spare or not feeling that the topic of the thesis was close enough to their area of expertise. Fortunately, one of them offered the suggestion that Alan Smeaton of DCU might be our man.

This turned out to be a considerable stroke of luck: Alan was indeed kind enough to accept the job, and his style of handling it was very much what I’d hoped for, making this test of my ability to defend my magnum opus feel much less like a test and more like a friendly conversation on a topic of mutual interest. Some among the audience even described the proceedings as entertaining, which is hardly the word that first comes to my mind when I think about thesis defences! In all honesty, it’s not the word I’d choose to describe mine either – I was far too nervous to be entertained – but it did feel quite rewarding after all the hard work (not to mention a not negligible amount of self-doubt) to talk for several hours as an equal to a senior academic who had taken such an interest in my research.

Having thus finished my doctorate, it was time to make some decisions. I felt that I wasn’t quite done with academia yet, but I also felt that I needed some kind of change, and besides, there’s a certain amount of pressure on fresh post-docs to go explore the world beyond their alma mater for a while. Since I happen to have a great deal of appreciation for many things Irish (including, but not limited to, grain-based fermented beverages and traditional music), this seemed like a potential opportunity to combine business with pleasure, so the next time I visited Dublin, I met up with Alan to have a cup of tea and a chat on the possibility of moving there to work with him, provided that some funding could be secured. Later that year, we submitted our first proposal for an MSCA Individual Fellowship to the European Commission.

MSCA wasn’t the only funding source we considered; I also applied to the Academy of Finland for a grant that would have involved me mostly staying in Finland at the University of Oulu but spending a mobility period of six months or so at DCU. However, the feedback I got from the Academy did not paint a rosy picture of my chances of winning the grant even after several iterations, and in the meantime, I was feeling rather aimless in my research and finding the idea of switching to industry more and more attractive. Unsurprisingly, my publication record for these past few years is not exactly impressive, but thankfully, the MSCA reviewers seemed to be more interested in what I wanted to accomplish than in what I’d (not) accomplished before. With the assistance of DCU’s excellent research support staff, Alan and I were able to put together a good plan, and then, after a very encouraging round of reviews, to make it even better for our second attempt. This was in 2016; in early 2017 we got the notification that the fellowship had been awarded. After that, I went on working in Oulu until the end of the year to complete my contract, took January 2018 off to wind down and make arrangements, moved to Dublin at the end of January and started the new project on the 1st of February.

If there’s a lesson to take home from all of this, I guess it would be that even if you haven’t been the most shining star of your peer group during your PhD studies, that doesn’t mean you’re out of options if you want to keep pursuing a career in research. Sure, some funders may dismiss you on the basis of your CV alone if they don’t think it shows enough potential, but still, a well prepared proposal can go a long way if you know your strengths and build your research plan around them. MSCA Individual Fellowships are perhaps a more forgiving funding source than many others, since they’re explicitly meant to help you advance your career and thus come with the built-in assumption that there’s something important you don’t yet have but can gain by carrying out the right sort of project with the right sort of host. So, if you’ve found yourself a host with a set of known strengths that complement the ones you’ve demonstrated in your previous work, you already have a pretty solid foundation to build your proposal on.

Getting started

Welcome to You Know Nothing, Socrates! The theme of this blog is knowledge, or more specifically – because that sure could use some narrowing down – the intersection of knowledge (in the philosophical sense) and computing. Knowledge, of course, is a notoriously elusive concept once you start trying to pin it down, which is why I’ve decided to name the blog after the famous Socratic paradox, apocryphal though it may be. And before you ask: yes, the title is also a Game of Thrones reference. Get over it.

To make matters worse, we haven’t been content to just assert that we as human beings have the ability to know various things and to derive new knowledge from evidence. Instead, ever since the invention of the modern digital computer, we’ve been very keen on the idea of replicating, or at least imitating, that ability in machines. This pursuit has given rise to fields of computer science research such as knowledge representation and knowledge discovery; this is the area where I’ve been working throughout my career as a researcher, and also the main subject area that I’ll be writing about.

A bit of context: I’m currently working as a Marie Curie Individual Fellow at the Insight Centre for Data Analytics in Dublin, Ireland. The project I’m working on, titled KDD-CHASER, deals with remote collaboration for the extraction of useful knowledge from personal data, such as one might collect using a wearable wellness device designed to generate meaningful metrics on the wearer’s physical activity and sleep. These products are quite popular and, presumably, useful, but for most users their utility is limited to whatever analyses the product has been programmed to give them. The research I’m doing aims to create an online platform where users of personal data-capturing devices can discover additional knowledge in their data with the help of expert collaborators.

As long as the KDD-CHASER project is running, which is until the end of January 2020, I will be using this blog as a communication channel (among others) to share information about its progress and results with the public. However, I’m also planning to post more general musings on topics that are related to, but not immediately connected with, the work I’m doing in the project. These, I hope, will be enough to keep the blog alive after the project is done and I move on to other things. Not that I’m expecting those other things to be radically different from the things I’m involved in at the moment, but hey, you never know.

There certainly isn’t a shortage of subject matter to draw on: besides the under-the-hood mechanics of computers capable of possessing and producing knowledge, there’s the philosophical dimension of them that I’m also deeply interested in – another reason for my choice of blog title. From here it’s not much of a conceptual leap to the even more bewildering philosophical questions surrounding the notion of artificial intelligence, so I might take the occasional stab at those as well. I fully expect to come to the conclusion that I really know absolutely nothing, but whether I’ll be any the wiser for it remains to be seen.