You’ve worked hard putting together a presentation, so why limit it to the people sitting in your talk?
Here are a few tips for repurposing your talk for sharing on social media.
1. Add your Twitter handle to your intro slide and encourage your audience to tweet. Limiting the text on your slides will also encourage tweets. If your text isn’t tweetable, there’s too much text.
Even if people are reading it, they’re not listening to you.
May 7, 2018 • 23 tweets • 8 min read
If you’re an academic you need a website so that people can easily find info about your research and publications. Here’s how to make your own website for free in under an hour using the blogdown package in #Rstats [THREAD]
So why use blogdown? Sure, there are several free options available to start your own blog (e.g., Medium). However, you generally can’t list your publications or other information easily on these services. Also, who knows where these services will be in a few years?
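The basic blogdown workflow looks roughly like the sketch below. The folder name and theme are assumptions for illustration (the thread doesn’t specify them); the setup calls are commented out because they install software and scaffold files, so run them interactively, once.

```r
# A minimal sketch of the blogdown workflow (run interactively in RStudio).
# The theme and folder name below are example choices, not from the thread.

# install.packages("blogdown")   # once, from CRAN
# blogdown::install_hugo()       # installs the Hugo site generator

site_dir <- "my-academic-site"   # hypothetical folder for the new site

# Scaffold a site from a theme, then preview it locally in the browser:
# blogdown::new_site(dir = site_dir, theme = "gcushen/hugo-academic")
# blogdown::serve_site()

# New posts/pages (e.g., a publications list) are just (R) Markdown files:
# blogdown::new_post("Publications", ext = ".Rmd")
```

From there, each publication or project is a plain Markdown file you edit yourself, which is exactly the control that services like Medium don’t give you.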
May 4, 2018 • 6 tweets • 4 min read
Every paper has open data if it presents a scatterplot.
1) Download WebPlotDigitizer automeris.io/WebPlotDigitiz…
2) Load a scatterplot screenshot
3) Select each data point
4) Download a .CSV file of the extracted data points
We used this tool in a recent meta-analysis to extract correlation coefficients from papers that didn't report coefficients (only scatterplots), which is a common issue in meta-analysis sciencedirect.com/science/articl…
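Once WebPlotDigitizer has exported the points, recovering a correlation coefficient is a one-liner in R. A self-contained sketch (the CSV here is simulated stand-in data, since the real file comes from your own digitizing session):

```r
# WebPlotDigitizer exports the clicked points as x,y coordinates.
# Simulate such a file here so the example runs on its own.
set.seed(1)
x <- rnorm(50)
y <- 0.5 * x + rnorm(50, sd = 0.8)
write.csv(data.frame(x, y), "digitized_points.csv", row.names = FALSE)

# In practice you'd start here, with the file WebPlotDigitizer produced:
pts <- read.csv("digitized_points.csv")
r <- cor(pts[[1]], pts[[2]])   # Pearson correlation from the digitized points
round(r, 2)
```

The recovered r can then go straight into a meta-analysis alongside the coefficients that papers did report.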
May 3, 2018 • 20 tweets • 7 min read
Funnel plots are often used to assess publication bias in meta-analysis, but these plots only visualise *small study* bias, which may or may not include publication bias. Here's a guide on making contour-enhanced funnel plots in #Rstats, which better visualise publication bias
First, some background... Publication bias is a well-known problem. For instance, researchers might shelve studies that aren’t statistically significant, as journals are unfortunately less likely to publish these kinds of results.
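In #Rstats, contour-enhanced funnel plots can be drawn with the metafor package (an assumption here; the thread’s own code may differ). A sketch using a dataset that ships with metafor:

```r
# Contour-enhanced funnel plot with metafor, using its bundled BCG dataset.
library(metafor)

dat <- dat.bcg                                   # 13 BCG vaccine trials
dat <- escalc(measure = "RR", ai = tpos, bi = tneg,
              ci = cpos, di = cneg, data = dat)  # log risk ratios + variances
res <- rma(yi, vi, data = dat)                   # random-effects meta-analysis

# Shaded regions mark the p < .10, p < .05, and p < .01 contours.
# Centring the funnel at 0 (refline = 0), rather than at the pooled
# estimate, is what makes the significance contours visible -- and with
# them, gaps of non-significant studies suggestive of publication bias.
funnel(res, level = c(90, 95, 99),
       shade = c("white", "gray65", "gray80"),
       refline = 0, legend = TRUE)
```

If missing studies cluster in the non-significant (unshaded/light) region, publication bias is a plausible explanation for the funnel’s asymmetry; if they fall elsewhere, small-study effects likely have another cause.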
Apr 26, 2018 • 11 tweets • 3 min read
Here are ten things that I HAVE NOT changed my mind about in the past few years of being a scientist (thanks to @mareberl for the suggestion)
1. Meta-analysis is a useful means of synthesizing research
Meta-analysis cops a lot of flak. But like ANY statistical tool, meta-analysis needs to be correctly applied. It still sits on top of the evidence pyramid osf.io/yq59d/
Apr 24, 2018 • 13 tweets • 6 min read
Here are ten things I’ve changed my mind about in the last few years of being a scientist
1. P-values are bad.
Nope. P-values are good for what they’re designed to do. Just because they’re (often) misused doesn’t mean that we should abandon them.
Apr 23, 2018 • 20 tweets • 6 min read
Meta-analyses are often used as a gold standard measure of evidence. But how much trust should you place in a meta-analysis outcome? Here are a few things you should look out for next time you read one [THREAD]
1. Don’t just check whether the authors state they followed PRISMA/MARS reporting guidelines, check whether they ACTUALLY did.
SPOILER ALERT: Very few do. The ones that do *typically* include a checklist in the supplement.
Feb 10, 2018 • 17 tweets • 7 min read
Last year I posted a preprint.
Doing this set off a chain of events that convinced me I should post a preprint for ALL my manuscripts.
Here’s my story (1/17)
First, some background.
Over the past few years, I’ve become interested in using Bayesian inference to complement my frequentist inferences. So, I put together a short presentation on Bayesian alternatives to NHST for my research group. (2/17)
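One common Bayesian alternative to a NHST t-test is a Bayes factor. The BayesFactor package is an assumption here (the thread doesn’t name one), and the data below are simulated for illustration:

```r
# Bayes factor analogue of a two-sample t-test, via the BayesFactor
# package (an assumed choice; simulated data for illustration only).
library(BayesFactor)

set.seed(42)
group_a <- rnorm(30, mean = 0.0)
group_b <- rnorm(30, mean = 0.5)

# BF10 > 1 favours the alternative (a group difference); BF10 < 1
# favours the null -- unlike a p-value, it can quantify support FOR H0.
bf   <- ttestBF(x = group_a, y = group_b)
bf10 <- extractBF(bf)$bf
bf10
```

That ability to express evidence for the null is a large part of why Bayes factors complement, rather than replace, frequentist tests.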
Feb 7, 2018 • 20 tweets • 7 min read
People love hating on PowerPoint presentations. But the problem isn't with PowerPoint, it's with presentation delivery.
Here are a few tips on improving your next PowerPoint presentation [THREAD]
Great presentations tend to include three elements: they’re clearly communicated, contain eye-catching images, and tell a compelling story.
Feb 3, 2018 • 32 tweets • 12 min read
I’ve peer reviewed A LOT of meta-analyses over the past few years. While I’ve noticed that the overall quality of meta-analyses is improving, many that I review suffer from the same issues. Here’s a thread listing the most common problems and how to avoid them.
#1 Not using PRISMA reporting guidelines (or stating that PRISMA guidelines were followed, but not *actually* following them).