SAGE Concept Grants: Feedback for applicants
Now in its third year, the 2020 SAGE Concept Grant program drew over 140 applications from individuals and teams all over the world building new software tools for social research. In this blog post, we’re giving you an insight into our judging criteria and sharing the most common reasons why applications did not progress further, to serve as feedback for this year’s applicants and guidance for future applicants.
We loved reading the huge array of proposals and want to thank everyone who took the time to submit their ideas and share their passions with us.
We were really impressed by all the amazing research tools that are being developed for social scientists and are committed to promoting as many as we can to the wider community. The most popular themes this year were data collection and analysis (20% of the applications), text mining (14%), social media (6%), online experiments (3%) and qualitative data analysis (3%).
In 2020 we updated our offering: one grant of £15,000 to scale up an existing prototype, and five grants of £2,000 each for promising ideas that still need to prove their technical feasibility.
Read on to learn more about the criteria we used in the judging of both grant types, and stay tuned for the announcement of our 2020 winners!
The idea was outside of scope
Our mission is to support social science by equipping social scientists with the skills, tools and resources they need to work with big data and new technology. Our Concept Grants program is one of the ways we provide this support: by seed funding the development of tools and technologies that will help social science researchers do better and more robust research. We are looking for technologies that can solve big pain points that social science researchers currently experience, or support innovative new methods that have a high probability of uptake within this discipline. This is why the proposals we fund must consider the research process in the social sciences and how it can be improved.
The main users were not social science researchers
We received a number of strong applications focused on specific societal challenges that social science researchers, computer scientists and other academics can help solve. Whilst there are many funding opportunities to support these aims, the SAGE Concept Grants aim instead to address the challenges that social science researchers face during the research process itself, such as data access and collection, experiment design, implementation, and other process-focused areas.
If the tool you are developing solves problems that all researchers, regardless of discipline, experience, it’s worth considering how social scientists specifically would or would not use it: What bottlenecks or alternatives are common within the social science community? What would hinder or incentivize social science researchers to use your tool rather than others? And how does that compare with academics from other disciplines? Many of the tools social scientists use originated in other disciplines, so we definitely consider such tools, but we need to understand whether and how you’ve considered social science academics as one of your use cases. Make sure to emphasize and clearly communicate this in your application.
The proposal lacked reflection around key users
One of the most useful insights we’ve gathered from working with research technologies is the importance of sizing your market or potential customer base. This information is key to setting realistic goals, planning better ways to reach users, and estimating the impact of the technology. There is, of course, no right answer here, and we know it is hard to estimate the size of a potential market, but simply stating that “a lot of researchers would use it” is too vague.
Really think about who your users are: What type of social science researcher are they? What kind of research or analysis do they do? What skills do they already have? Be specific: use percentages, regions, particular disciplines, and so on. “All academics” or “all social science researchers” is too broad: they do too many different things. Perhaps you are working with German-speaking sociologists who use social media, or English-speaking political scientists who work with large corpora but have no coding skills. There are a number of ways to estimate your pool of potential users; none is perfect, but any of them will help you understand your target users.
A good way to approach this is to break your target users down into groups: the initial beta testers for a very specific set of features (who they are and how many of them you can reach), a larger cohort of early testers, and a plan to reach an even larger number of users, perhaps with a slightly more general set of features.
Weak market research
A large part of our work at SAGE Ocean is finding and reviewing the tools and technologies that social science researchers use in their research projects. From all our exploration, one thing has become certain: academics are both innovative and creative. They will go out of their way to figure out how to do something, even when there is no technology that easily supports it. This is why, if you say there is no direct competitor to your tool, we consider that weak market research: there is always an alternative that some academics are already using, DIY or not. A good starting point for getting to know the landscape is the list we compiled of more than 500 tools and the associated whitepaper.
It’s worth noting that a competitor or an alternative is not necessarily an exact copy of your technology; instead, it is the next best solution that your target researcher is currently using.
Poor definition of the problem space
One of the most difficult yet vital questions to ask yourself is: what problem are you trying to solve? The definition of the problem must be clear and succinct. It is important to illustrate the extent of the problem from the perspective of the user or researcher, rather than simply focusing on the features of your tool in their own right. It may be that nothing like your tool currently exists, but that alone is not a sufficient analysis of the problem space. It’s worth considering why it doesn’t exist: Is the problem space only just emerging? What other factors affect how researchers work in this space? And crucially, is the problem big enough to justify building a new tool to address it?
We would have liked to see more applicants describe how they would test their business model, rather than relying on a ‘build it and they will come’ approach. We encourage applicants to think about the life of the tool after the grant and to consider ideas for ensuring its sustainability. With the SAGE Concept Grants we support any model that can sustain a tool over the long term, be it open or closed source, free for academics or not.
We look forward to reading next year’s applications! If you have suggestions for how we could improve our application process, please do comment or contact us at ocean@sagepub.com.