As an English Wikimedian, I want to understand the impact of the Thank You page experiment on English Wikipedia, so that I can decide whether we should continue this experiment.
https://www.mediawiki.org/wiki/Extension:GrowthExperiments/Technical_documentation/Campaigns/Creation_of_customized_landing_pages
Previous similar task: T331495: Quick Analysis: Thank You Pages: custom account creation pages for sv, it, ja, fr, nl
Previous research: Newcomer Experience Pilot Project – Thank You Pages and Thank You Banners
Previous analysis of this campaign: T352116: Thank You page experiment: First week analysis of English Wikipedia experiment
Review accounts created with the campaign parameters typage-6C-en-2023 and typage-6C-IAD-2023
Provide the following metrics:
In this case we're analyzing the effects of these campaigns for different time periods.
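To make the account-selection step concrete, here's a minimal sketch of filtering account-creation events down to the two campaign parameters under review. The record structure and field names (`campaign`, `platform`) are illustrative assumptions; in practice this data would come from the account-creation event logs, not an in-memory list.

```python
# Sketch: keep only account-creation events tagged with one of the two
# campaign parameters being analyzed. Field names are illustrative.

CAMPAIGNS = {"typage-6C-en-2023", "typage-6C-IAD-2023"}

def accounts_for_campaigns(events, campaigns=CAMPAIGNS):
    """Return the account-creation events tagged with one of the campaigns."""
    return [e for e in events if e.get("campaign") in campaigns]

# Toy data with the assumed shape; a real run would read event-log rows.
events = [
    {"user": "A", "campaign": "typage-6C-en-2023", "platform": "desktop"},
    {"user": "B", "campaign": "typage-6C-IAD-2023", "platform": "mobile web"},
    {"user": "C", "campaign": None, "platform": "desktop"},
]
print(len(accounts_for_campaigns(events)))  # 2
```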
Thanks! Moving to blocked until the banner campaign is over.
I've collected data and calculated the statistics for both of the campaigns using the date/time ranges listed in the task description.
For the typage-6C-en-2023 campaign we get the following statistics (see the Definitions section below for metric definitions):
These numbers are very similar to those calculated after one week as posted in T352116#9387844. In these final statistics we have a slightly higher desktop registration rate (36.7% as compared to 36.0%) and a slightly lower mobile web registration rate (36.5% versus 37.9%). These differences could point to changes in user behaviour over the holiday period: for example, vacation time may mean that desktop users who registered during the later stages of the campaign were more likely to be at home rather than at work.
I've also calculated the revert rate of the edits made by these users, as well as what proportion of their edits came through the Suggested Edits module on the Newcomer Homepage.
Looking at the revert rate across all edits, and keeping in mind that contribution amounts vary greatly between users, the overall revert rate is 8.9% out of <900 edits (we're not reporting specific numbers per our Data publication guidelines). The rate varies substantially by platform: on desktop it's 6.8% out of <450 edits, while on mobile web it's 11.0% out of <450 edits. Compared to the one-week statistics, these are about 2 percentage points higher across the board. That being said, these revert rates are still much lower than those of typical newcomers.
The low revert rate might be a result of the high proportion of Suggested Edits these newcomers make. Overall the proportion is 61.8% out of <900 edits. The rate is lower on desktop (53.6% out of <450 edits) than on mobile web (70.0% out of <450 edits). These proportions are very similar to what we saw one week after deployment, indicating that the rate has been stable across the campaign.
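The per-platform revert and Suggested Edits rates above can be sketched as a simple aggregation. The edit records and field names (`platform`, `reverted`, `suggested_edit`) here are illustrative assumptions, not the actual schema used for the analysis.

```python
# Sketch: per-platform revert rate and Suggested Edits proportion,
# aggregated over a list of simplified edit records (illustrative schema).
from collections import defaultdict

def platform_metrics(edits):
    """Return {platform: (revert_rate, suggested_edits_rate, n_edits)}."""
    counts = defaultdict(lambda: [0, 0, 0])  # [reverted, suggested, total]
    for e in edits:
        c = counts[e["platform"]]
        c[0] += e["reverted"]        # booleans count as 0/1
        c[1] += e["suggested_edit"]
        c[2] += 1
    return {p: (r / n, s / n, n) for p, (r, s, n) in counts.items()}

# Toy data: two desktop edits (one reverted), two mobile Suggested Edits.
edits = [
    {"platform": "desktop", "reverted": False, "suggested_edit": True},
    {"platform": "desktop", "reverted": True, "suggested_edit": False},
    {"platform": "mobile web", "reverted": False, "suggested_edit": True},
    {"platform": "mobile web", "reverted": False, "suggested_edit": True},
]
print(platform_metrics(edits)["desktop"])  # (0.5, 0.5, 2)
```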
The typage-6C-IAD-2023 campaign was deployed towards the end of the year, meaning this is the first set of statistics gathered for that campaign. It also uses a banner, which means we are likely to see very different behaviour patterns. We also ran Thank You page- and banner-based campaigns in 2021/22 and noticed similar differences in patterns there.
The registration rate on the desktop platform (3.0%) is comparable to the overall registration rate of our previous banner-based campaign (3.7%), while the mobile web rate is a lot lower. I don't have a hypothesis for why the mobile web rate is so much lower, although in previous campaigns we've often ended up asking whether there's a mismatch in user expectations. When it comes to activation rates, the desktop rate (17.1%) is in line with the overall rate of the previous campaign (16.8%). The mobile web rate comes in somewhat lower (13.2%), but I don't think that's a substantial difference.
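The rate comparison can be illustrated with a small sketch. The counts below are made-up placeholders, and the definitions used (registrations per landing-page visit, activated accounts per registration) are assumptions for illustration rather than the task's exact Definitions section.

```python
# Sketch: registration and activation rates from raw counts.
# All counts below are made-up placeholders, not campaign data.

def rate(numerator, denominator):
    """Simple ratio, guarding against an empty denominator."""
    return numerator / denominator if denominator else 0.0

visits, registrations, activated = 1000, 30, 5  # hypothetical counts
print(f"registration rate: {rate(registrations, visits):.1%}")    # 3.0%
print(f"activation rate:   {rate(activated, registrations):.1%}")  # 16.7%
```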
We've also investigated the revert rates and Suggested Edits rates for these users. As before, we're not reporting specific numbers. The revert rate for these newcomers is on par with or slightly higher than that of the average newcomer. Overall it's 28.6% out of <300 edits, and it's slightly lower on desktop (28.3% out of <100 edits) than on mobile web (28.8% out of <200 edits). This might be because these users have specific edits in mind when they visit Wikipedia, as their Suggested Edits rate is much lower than that of the Thank You page campaign (61.8% as reported above). Overall, the Suggested Edits rate is 44.4%, slightly higher on desktop (44.6%) than on mobile web (42.9%).
Thank you @nettrom_WMF! I've added a summary of this analysis here: https://www.mediawiki.org/wiki/Growth/Newcomer_experience_projects#Scaling_the_new_donor_Thank_you_page_to_English_Wikipedia
Please review and feel free to add to this summary.
I believe we can consider this task resolved. Thanks!