Archive Update

  • Planned Archive downtime: RAM upgrade

    By Lucy Pearson on Tuesday, 17 July 2012 - 8:44am

    The Archive of our Own will be down for planned maintenance for approximately three hours from 15.00 UTC on Friday 20 July (see what time this is in your timezone). During this period we'll be installing some new RAM and performing some other maintenance (more details below for the curious!).

    We'll keep users updated via our AO3_Status Twitter account as the work progresses. Thanks for your patience while we complete this work!

    New RAM

    Cartoon-style image of a server.
    Our database server looking grumpy about having too little RAM!

    We're doubling the RAM in our database server and in our two application servers. Increasing RAM will help our system cope with more users: for example, it will allow us to run more unicorn workers, which serve up the content you're trying to access. This should help site performance as the site expands.

    You can imagine the unicorns lining up in the hall of RAM to fetch you things from the treasure trove of fanworks: if there aren't many unicorns, you have to wait till one can serve you, which sometimes means you get a 502 error. We can increase the number of unicorns to make things go faster for you, but if the hall is too small (there isn't enough RAM) then things get crowded and inefficient and everything slows down again. More RAM allows us to increase the number of unicorns without slowing things down. (For the interested, this more technical explanation of Unicorn isn't exactly the way things are set up on the AO3, but will give you an idea.)
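    To make the unicorns-in-the-hall-of-RAM picture concrete, here's a rough back-of-the-envelope sketch in Python. The memory figures and the queueing formula are illustrative assumptions, not the AO3's actual numbers or configuration:

```python
# Illustrative sketch (not the AO3's real setup): how the number of worker
# processes that fit in RAM affects how long requests wait in the queue.
# All figures here are made-up assumptions for the example.

def max_workers(total_ram_mb: int, worker_ram_mb: int) -> int:
    """How many worker processes fit in the available RAM."""
    return total_ram_mb // worker_ram_mb

def avg_wait(requests_per_sec: float, secs_per_request: float, workers: int) -> float:
    """Rough utilisation-based estimate of queueing delay. If demand exceeds
    capacity, the queue grows without bound (a 502 error in practice)."""
    capacity = workers / secs_per_request          # requests/sec we can serve
    if requests_per_sec >= capacity:
        return float("inf")                        # overloaded: timeouts/502s
    utilisation = requests_per_sec / capacity
    # Crude approximation: waiting time grows sharply as utilisation nears 1
    return secs_per_request * utilisation / (1 - utilisation)

# Doubling RAM doubles the workers we can run, halving utilisation:
before = avg_wait(40, 0.5, max_workers(16_384, 512))   # 32 workers
after  = avg_wait(40, 0.5, max_workers(32_768, 512))   # 64 workers
```

    The key point the sketch captures is the non-linearity: as utilisation approaches 100%, waits blow up, so doubling RAM buys much more than double the headroom.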

    New drives

    We're also installing some new drives in our two oldest machines. Both these machines have room for six drives; currently they each have four installed. Information is mirrored on the drives so that if one goes down, the system continues to work. At the moment, one machine has a broken drive. We'll be replacing the broken drive, and at the same time adding two spares to both machines so that we have more backups if anything else breaks.
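    For the curious, the mirroring idea can be sketched in a few lines of Python. This is a toy model of the concept only, not the servers' actual RAID configuration:

```python
# Toy sketch of drive mirroring (RAID-1 style): every write goes to all
# healthy drives, so reads still succeed when one drive fails.

class MirroredStorage:
    def __init__(self, num_drives: int):
        self.drives = [dict() for _ in range(num_drives)]
        self.failed = set()

    def write(self, key, value):
        # Mirror the data onto every drive that is still working
        for i, drive in enumerate(self.drives):
            if i not in self.failed:
                drive[key] = value

    def read(self, key):
        # Any surviving mirror can serve the read
        for i, drive in enumerate(self.drives):
            if i not in self.failed and key in drive:
                return drive[key]
        raise IOError("all mirrors lost")

    def fail_drive(self, i):
        self.failed.add(i)

store = MirroredStorage(num_drives=4)
store.write("work_1", "fanwork data")
store.fail_drive(0)            # one broken drive, as on our oldest machine
assert store.read("work_1") == "fanwork data"   # data still readable
```

    Adding spare drives is the same idea taken further: more mirrors means more failures can happen before any data becomes unreachable.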

    Our two original machines preparing to nom their new drives.
    Two cartoon-style images of servers.

    Mirrored from an original post on the Archive of Our Own.

  • My, how we've grown! A few AO3 stats

    By Lucy Pearson on Monday, 16 July 2012 - 4:12pm

    We've been talking a lot recently about how much the AO3 has expanded over the last few months. One easy statistic for us to lay our hands on is the number of registered accounts, but this represents only a tiny portion of site activity. Our awesome sys-admin James_ has been doing some number crunching with our server logs to establish just how much we've grown, and has provided us with the following stats (numbers for June are not yet available). Thanks to hele for making them into pretty graphs!

    Visitors to the AO3

    Line graph showing the number of visitors to the AO3 per month, December 2010 to May 2012. The line progresses steadily upwards with a significant spike from 1,197,637 in April 2012 to 1,409,265 in May 2012.

    The number of unique visitors to the site has increased almost every month since December 2010 (each unique IP address is counted as one visitor). There are a few points where the rate of increase gets more dramatic: there was a jump of 244,587 across December 2011 and January 2012, compared to one of 137,917 over the two months before that. This can probably be accounted for by the fact that during December and January, holiday challenges such as Yuletide bring more people to the site. This theory is borne out by the fact that there was a slight dip in the number of visitors during February 2012, indicating that some of the extra traffic in the previous two months came from 'drive-by' visitors who didn't stick around.

    May 2012 saw a steep increase in the number of visitors: there were 211,628 more visitors to the site than there had been the month before! The rapid increase in visitors was not without its price: this was the month of many 502 errors!

    Traffic to the AO3

    Line graph showing AO3 traffic in GB per month, December 2010 to May 2012. The line progresses steadily upwards with a significant spike from 2192 GB in April 2012 to 2758 GB in May 2012.

    The increase in the number of visitors to the site has also been accompanied by an increase in overall site traffic (how much data we're serving up). Again, there's a significant spike during December/January. Interestingly, there's no dip in traffic for February 2012, showing that even though there were some 'one time' visitors over the holiday period, there were also plenty of people who stayed and continued to enjoy fanworks on the site.

    The increase in traffic to the site clearly accelerated in 2012. Between January and May 2011 traffic increased by just 159.92 GB; the same period in 2012 saw an increase of 1,870.26 GB! In fact, with an increase of 566 GB during May 2012, that month alone saw almost as big a jump in traffic as the whole of the previous year (595.63 GB)!
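    As a quick sanity check, the May 2012 jumps quoted above follow directly from the figures on the graphs:

```python
# Recomputing the May 2012 jumps from the figures quoted in this post.
visitors = {"2012-04": 1_197_637, "2012-05": 1_409_265}
traffic_gb = {"2012-04": 2192, "2012-05": 2758}

may_visitor_jump = visitors["2012-05"] - visitors["2012-04"]
may_traffic_jump = traffic_gb["2012-05"] - traffic_gb["2012-04"]

print(may_visitor_jump)   # 211628 extra visitors in May 2012
print(may_traffic_jump)   # 566 GB of extra traffic in May 2012
```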

    And the other stuff

    With these kinds of numbers, it's not surprising that there've been a few bumps along the way. For information on how we're dealing with the growth in the site you can check out our posts on performance and growth and accounts and invitations.

    Many thanks to our dedicated volunteers for their hard work dealing with the growth of the site, and to our fabulous users for their patience with our growing pains - and for creating the awesome fanworks so many people are flocking here to see!

    Mirrored from an original post on the Archive of Our Own.

  • AO3 Announcement: Disabling filters: Information and search tips

    By Lucy Pearson on Wednesday, 13 June 2012 - 10:29am

    Key information: As an emergency measure to deal with recent performance issues, we have disabled browsing filters on the site (the grey box of choices which appears on work index pages). This is a temporary measure to ensure that as many people as possible can access the site. You can still use our tags and advanced search feature to find the works you want. As an additional bonus, removing the filters has allowed us to remove the 1000 works cap on lists of works, so you can browse through all the works in your fandom! Read on for more information!

    What's happening

    As detailed in our recent post on performance, our coders and sys-admins are continuing to work on the performance issues we've been experiencing. We've made some server adjustments which have alleviated some of the worst problems, but we still need to make some substantial changes to fix the issues. We're aware that lots of users are still unable to access the site; as an emergency measure, we've decided to disable tag filters, which put a very heavy load on our servers. This means that the grey box with tags you can check to filter a list of works will no longer appear on the work index pages. We know this will be an inconvenience for many users, but the filters are really the 800-pound gorilla sitting on top of our database. Removing them for now will mean that people can access the site, even if they can't browse quite as easily as usual.

    We've been working on significantly redesigning the part of our code that handles filtering for a while - because it's a major performance burden on some of the most popular pages of the site, refactoring this code to make it more efficient has been a priority for some time now. We're almost done with the rewritten version, but it needs more work and extended testing before we roll it out. (We want to be sure it doesn't introduce new bugs.) So, the filters will go away for a few weeks, and will then be replaced by the new, rewritten version.

    One major disadvantage of the way the filters were designed was that they needed to retrieve the tags from the list of works found in order to build the filter options. This meant that we had to limit the number of works returned at one time to 1000, because otherwise building the filters would take too long. A side bonus of removing the filters is that we've been able to remove the 1000-works cap! The browsing redesign in progress aims to work around this issue, so we hope to avoid re-introducing this limitation when filtering returns.

    How can I find the works I want?

    Although the removal of the filters will make it harder to browse the works listings for specific things, there are still lots of ways to find the works you need.

    Fandoms page

    If you're looking for a specific fandom, you can browse the Fandoms page. Fandoms are organised by media type; the easiest way to find a particular fandom is to use Ctrl + F (or Command + F on a Mac) to search the page in your browser. The fandom pages will give you a list of all the works in your fandom; unfortunately there will be no way to filter that list down further.


    Tags

    Clicking on any tag will still bring up works with that tag, or with any tag marked as a synonym. So, if you click on Riza Hawkeye you'll get all the works tagged with 'Riza Hawkeye', 'Riza', 'Riza is awesome', etc. Again, while the filters are disabled there'll be no way to filter this list further.

    Advanced Search

    If you want more refined control over which works you find, you will need to use our Work Search. This feature could use a little bit of prettifying, but the underlying search is quite powerful. Use the following tips to help you find exactly the works you want:

    • A space equals AND. So, entering Fluff Friendship would find you works tagged with both 'fluff' and 'friendship'
    • | equals OR. So, entering Homestuck | My Little Pony will find you works tagged with 'Homestuck' AND/OR 'My Little Pony'
    • - equals NOT. So, entering Supernatural - Castiel/Dean Winchester will find works tagged Supernatural, but will exclude those tagged Castiel/Dean Winchester.
    • Fandom, Character, Relationship, Rating, Category, and Warning are all classed as tags (as well as the 'Additional tags'). So, you can search for works which are Explicit, or exclude works tagged 'Major Character Death'.
    • Using quotes around a phrase will search for that exact phrase. So, "Harry Potter" will get works tagged with 'Harry Potter', whereas Harry Potter will get works tagged with 'Harry' and works tagged with 'Potter'.
    • Entering a term in the tag field will only find works with exactly that tag - so searching for Charles/Erik will bring up only the few works tagged with exactly that tag, not the ones tagged 'Erik Lehnsherr/Charles Xavier' (whereas if you click on the 'Charles/Erik' tag you'll get works with all variations of that pairing).
    • The search has trouble with tags which contain dashes. If you search for X-Men, for instance, you'll notice you get lots of results matching 'X' and none matching 'X-Men'. To get around this, put the tag in quotes: "X-Men".

    As well as searching tags, titles, and authors, you can also search for specific word counts, hits, kudos, and dates - including ranges, which is a useful tool for finding fics in a fandom. For example, you can search for all Stargate Atlantis fics published 5-6 years ago.

    Some search examples!

    • Find an explicit Fullmetal Alchemist work with the pairing Riza Hawkeye/Roy Mustang, with no Archive Warnings: Enter "Fullmetal Alchemist" "Riza Hawkeye/Roy Mustang" "No Archive Warnings Apply" Explicit.
    • Find works with Rodney McKay but without John Sheppard: Enter "Rodney McKay" -"John Sheppard".
    • Find works tagged with "Alternate Universe" in either the Homestuck or White Collar fandoms: Enter "Alternate Universe" Homestuck | "White Collar".
    • Find all explicit works tagged as angst, but excluding M/M pairings: Enter Angst Explicit -M/M
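    For readers who like to tinker, the AND / | / - rules above can be mimicked in a few lines of Python. This is a toy evaluator run against a hypothetical work's tag set, not the Archive's real search engine, so treat it as an illustration of the semantics only:

```python
import re

def parse_terms(clause: str):
    """Split a clause into (negated, term) pairs; quoted phrases stay whole."""
    out = []
    for neg_q, phrase, neg_w, word in re.findall(r'(-?)"([^"]+)"|(-?)(\S+)', clause):
        if phrase:
            out.append((neg_q == "-", phrase))
        else:
            out.append((neg_w == "-", word))
    return out

def matches(query: str, tags: set) -> bool:
    """True if any '|'-separated alternative has all its positive terms
    present in the tag set and all its '-' terms absent."""
    for clause in query.split("|"):
        terms = parse_terms(clause.strip())
        if terms and all((t not in tags) if neg else (t in tags)
                         for neg, t in terms):
            return True
    return False

work = {"Supernatural", "Fluff", "Friendship"}
assert matches("Fluff Friendship", work)              # space = AND
assert not matches('Supernatural -"Fluff"', work)     # - = NOT
assert matches("Homestuck | Supernatural", work)      # | = OR
```

    Note this toy treats | as binding loosest (each side of a | is a complete alternative), which is a simplification of how a full search engine parses mixed operators.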

    Search bookmarklets

    If you find yourself re-using the same search parameters (only T-rated works, only works under 5,000 words, only works with over 10 kudos) for new fandoms or characters you fall in love with, you could give these custom search bookmarklets a try. They aren't official AO3 tools, but they were made by one of our own and use the Advanced Search functionality. Think of them as a saved search that lets you enter a keyword (such as a fandom name or specific kink) and spits out only the kind of works you want to see. For help putting together your own saved search, don't hesitate to comment on the post or here.

    What next?

    This is definitely a short term measure, but we think it will have a big effect on site performance. In a few weeks we hope to deploy our all new search and browse features, which will restore more browsing functionality without placing the same load on the servers. We thank you for your patience and understanding while we continue to work on the problem areas.

    Mirrored from an original post on the Archive of Our Own.

  • Update on AO3 performance issues

    By Lucy Pearson on Monday, 11 June 2012 - 12:10pm

    Since last month, we've been experiencing frequent and worsening performance problems on the Archive of Our Own as the site has expanded suddenly and dramatically. The number of new users joining the site doubled between April and May, and we currently have over 17,000 users waiting for an invitation. We've been working hard to deal with the 502 errors and site slowdowns, and we've implemented a number of emergency fixes which have slightly alleviated the issues, but these haven't been as effective as we'd hoped. We're confident that we will be able to fix the problems, but unfortunately we expect the next round of fixes to take at least two weeks to implement.

    We know that it's really frustrating for users when the site is inaccessible, and we're sorry that we're not able to fix the problems more quickly. We wanted to give you an update on what's going on and what we're doing to fix it: see below for some more details on the problems. While we work on these issues, you should get better performance (and alleviate the load on the servers) by browsing logged-out where possible (more details below).

    Why so many problems?

    As we mentioned in our previous post on performance issues, the biggest reason for the site slowdowns is that site usage has increased dramatically! We've almost doubled our traffic since January, and since the beginning of May the pace of expansion has accelerated rapidly. In the last month, more than 8,000 new user accounts were created, and more than 31,000 new works were posted. This is a massive increase: April saw just 4,000 new users and 19,000 new works. In addition to the growing number of registered users, we know we've had a LOT more people visiting the site: between 10 May and 9 June we had over 3,498.622 GB of traffic. In the past week, there were over 12.2 million page views - this number only includes the ones where the page loaded successfully, so it represents a lot of site usage!

    This sudden and dramatic expansion has come about largely as a result of changes elsewhere: another site has recently introduced more stringent enforcement of its policies relating to explicit fanworks, which has resulted in some fans no longer being able to host their works there. One of the primary reasons the AO3 was created was to provide a home for fanworks at risk of deletion elsewhere, so we're very keen to welcome these new users, but in the short term this does present us with some challenges!

    We'd already been preparing for site expansion and identifying areas of the site which needed work in order to ensure that we could grow. This means some important performance work has been ongoing; however, we weren't expecting quite such a rapid increase, so we've had to implement some changes on an emergency basis. This has sometimes meant a few additional unexpected problems: we're sorry if you ran into bugs while our maintenance was in progress.

    What we've done so far

    Our sys-admins and coders have implemented a number of things designed to reduce the load on the site over the last week:

    • Implemented Squid caching for a number of the most performance-intensive places on the site, including work index pages. For the biggest impact, we focused on caching the pages which are delivered to logged-out users. This is because all logged-out users usually see the same things, whereas logged-in users might have set preferences (e.g. to hide warnings) which can't be respected by the cache. We initially implemented Squid caching for individual works, but this caused quite a few bugs, so we've suspended that for now while we figure out ways of making it work right. (You can read more about what Squid is and what it does in Release Notes 0.8.17.)
    • Redistributed and recalibrated our unicorns (which deliver requests to the server and retrieve the data) to make sure they're focused on the areas where we need them most. This included setting priorities on posting actions (so that you're less likely to lose data when posting or commenting), increasing the numbers of unicorns, and adjusting the time they wait for an answer.
    • Simplified bookmark listings, which were using lots of processing power. We'll be looking into revamping these in the future, but right now we've stripped them back to the basics to try to reduce the load on the site.
    • Cached the listing of guest kudos so the number doesn't have to be fetched from the database every time there are new kudos (which caused a big strain on the servers)
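    The guest-kudos change is a classic time-to-live cache: keep the computed count in memory for a short while instead of hitting the database on every page view. Here's a minimal sketch of the idea; the function names, TTL, and data layout are assumptions for illustration, not the AO3's actual code:

```python
import time

_cache = {}       # work_id -> (count, expires_at)
CACHE_TTL = 60    # seconds; an assumed value for the example

def count_guest_kudos_from_db(work_id, db):
    """Stand-in for the expensive database query."""
    db["queries"] += 1
    return db["guest_kudos"].get(work_id, 0)

def guest_kudos(work_id, db):
    now = time.monotonic()
    hit = _cache.get(work_id)
    if hit and hit[1] > now:
        return hit[0]                      # served from cache: no DB query
    count = count_guest_kudos_from_db(work_id, db)
    _cache[work_id] = (count, now + CACHE_TTL)
    return count

db = {"guest_kudos": {42: 17}, "queries": 0}
assert guest_kudos(42, db) == 17 and db["queries"] == 1   # first view queries the DB
assert guest_kudos(42, db) == 17 and db["queries"] == 1   # repeat views are cached
```

    The trade-off is freshness: a count can be up to the TTL out of date, which is acceptable for a kudos number but saves one query per page view.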

    Implementing these changes has involved sustained work on the part of our sys-admins, coders and testers; in particular, the Squid caching involved a great deal of hard work in order to set up and test. Several members of the team worked through the night in the days leading up to the weekend (when we knew we would have lots of visitors) in order to implement the performance fixes. So, we're disappointed that the changes so far haven't done as much as we'd hoped to get rid of the performance problems - we were hoping to be able to restore site functionality quickly for our users, but that hasn't been possible.

    What we're going to do next

    Although the emergency fixes we've implemented haven't had as much impact as we'd hoped, we're confident that there are lots of things we can do to address the performance problems. We're now working on the following:

    • New search and browse code. As we announced in our previous post on performance issues, we've been working for some time on refactoring our search and browse code, which is used on some of the most popular pages and needs to be more efficient. This is almost ready to go -- in fact, we delayed putting it onto our test archive in order to test and implement some of the emergency fixes -- so as soon as we have been able to test it and verify that it's working as it should, then we will deploy this code.
    • More Squid caching. We weren't able to cache as many things as we'd initially hoped because the Squid caching threw up some really tricky bugs. We're continuing to work on that and we'll implement more caching across the site once we've tested it more thoroughly.
    • More servers. We're currently looking at purchasing a more robust database server and moving our old database server (aka 'the Beast') into an application slot, giving us three app servers. We'll also be upgrading the database software we use so that we can make the most of this server power.

    When we'll be able to implement the fixes

    We're working as fast as we can to address the problems -- we poured all our resources into the emergency fixes this week to try to get things up and running again quickly. Now that we've implemented those emergency fixes, we think that we need to focus on making some really substantive changes. This means we will have to slow down a little bit in order to make the bigger changes and test them thoroughly (to minimise the chances of introducing new bugs while we fix the existing problems). Buying servers will also take us some time because we need to identify the right machines, order them and install them. For this reason, we expect it to take at least two weeks for us to implement the next round of major fixes.

    We're sorry that we're not able to promise that we'll fix these problems right away. We're working as hard as we can, but we think it's better to take the time to fix the problems properly rather than experimenting with lots of emergency fixes that may not help. Since the AO3 is run entirely by volunteers, we also need to make sure we don't burn out our staff, who have been working many hours while also managing their day jobs. So, for the long term health of the site as a whole, we need to ensure we're spending time and resources on really effective fixes.

    Invitations and the queue

    As a result of the increasing demand for the site, we're experiencing a massive increase in requests for invitations: our invitations queue now stands at over 17,000. We know that people are very disappointed at having to wait a long time for an invitation, and we'd love to be able to issue them faster. However, the main reason we have an invitations system for creating accounts is to help manage the growth of the site -- if the 17,000 people currently waiting for an invitation all signed up and started posting works on the same day the site would definitely collapse. So, we're not able to speed up issuing invitations at this time: right now we're continuing to issue 100 invitations to the queue each day, but we'll be monitoring this closely and we may consider temporarily suspending issuing invitations if we need to.

    Until recently, we were releasing some invitations to existing users who requested them. However, we've taken the decision to suspend issuing invitations this way for the present, to enable us to better monitor site usage. We know that this will be a disappointment to many users who want to be able to invite friends to the site, but we feel that the fairest and most manageable way to manage account creation at present is via the queue alone.

    What can users do?

    We've been really moved by the amount of support our users have given us while we've been working on these issues. We know that it's incredibly annoying when you arrive at the Archive full of excitement about the latest work in your fandom, only to be greeted by the 502 error. We appreciate the way our users have reached out to ask if they can help. We've had lots of questions about whether we need donations to pay for our servers. We always appreciate donations to our parent organization, the Organization for Transformative Works, but thanks to the enormous generosity fandom showed in the last OTW membership drive, we aren't in immediate need of donations for new servers. In fact, thanks to your kindness in donating during the last drive, we're in good financial shape and we're able to buy the new server we need just as soon as we've done all the necessary work.

    As we've mentioned a few times over the weekend, we can always use additional volunteers who are willing to code and test. If this is you or anyone you know, stop by Github or our IRC chat room #otw-dev!

    There are a few things users can do when browsing which will make the most of the performance fixes we've implemented so far. Doing the following should ease the pressure on the site and also get you to the works you want to see faster:

    • Browse while logged out, and only log in when you need to (e.g. to leave comments, subscribe to a work, etc). Most of our caching is currently working for logged-out users, as those pages are easier to cache, so this will mean you get the saved copies which come up faster.
    • Go direct to works when you can - for example, follow the feeds for your favourite fandoms to keep up with new works without browsing the AO3 directly, so you can click straight into the works you like the sound of.

    Support form

    Our server problems have also made it difficult to access our support form at times. If you have an urgent query, you can reach our Support team via the backup Support form. It's a little more difficult to manage queries coming through this route, so we'd appreciate it if you'd avoid submitting feature requests through this form, to enable us to keep on top of bug reports. Thanks!

    Thank you

    We'd like to say a big, big thank you to all our staff who have been working really hard to address these problems. A particular shoutout to James, Elz, Naomi and Arrow, who have been doing most of the high level work and have barely slept in the last few days! We're also incredibly grateful to all our coders and testers who have been working on fixing issues and testing them, to our Support team, who have done an amazing job of keeping up with the many support tickets, and to our Communications folk who've done their best to keep our users updated on what's going on.

    We'd also like to say a massive thank you to all our users for your incredible patience and support. It means so much to us to hear people sending us kind words while we work on these issues, and we hope we can repay you by restoring the site to full health soon.

    A note on comments: We've crossposted this notice to multiple OTW news sites in order to ensure that as many people see it as possible. We'll do our best to keep up with comments and questions; however, it may be difficult for us to answer quickly (and on the AO3, the performance issues may also inhibit our responses). We're also getting lots of traffic on our AO3_Status Twitter! Thanks for your patience if we don't respond immediately.

  • AO3 performance issues

    By Lucy Pearson on Sunday, 3 June 2012 - 3:38pm

    As pretty much all users of the Archive of Our Own have no doubt noticed, we've been experiencing some problems with Archive loads: slowdowns and the appearance of the dreaded 502 page have become a regular occurrence. We're working on addressing these issues, but it's taking longer than we'd like, so we wanted to update you on what's going on.

    Why the slowdowns?

    Mostly because there's so much demand! The number of people reading and posting now is overwhelming - we're glad so many people want to be here, but sorry that the rapid expansion of the site is making it less functional than it should be.

    We now get over a million and a half pageviews on an average day, often clustered at peak times in the evening (particularly when folks in the Western Hemisphere are home from work and school) - we were using a self-hosted analytics system to monitor site traffic, and we had to disable it because it was too overloaded to keep up. The traffic places high demands on our servers, and you see the 502 errors when the systems are getting more requests than they can handle. Ultimately we'll need to buy more servers to cope with rising demand, but there's also ongoing work, already well underway, to make our code more efficient. We've been working on long-term plans to improve our work and bookmark searching and browsing, since those are the pages that get the most traffic; right now, they present some challenges because they were designed and built when the site was much smaller. We've learned a lot about scaling over the years, but rewriting different areas of the code takes some time!

    What are you doing to fix it?

    Our Systems team are making some adjustments to our server setup and databases. Their first action was to increase the amount of tmp space for our MySQL database on the server - this has alleviated some of the worst problems, but doesn't really get at the underlying issues. They're continuing to investigate to see if there are additional adjustments we can make to the servers to help with the problems.

    We're also actively working on the searching and browsing code: that's been a big project, and it will hopefully make a significant impact. Because it affects a lot of crucial areas of the site, we want to make sure we get everything right and do as much testing as we can to ensure that performance is where it needs to be before we release it. We're switching from the Sphinx search engine to elasticsearch, which can index new records more rapidly, allowing us to use that for filtering. That will offer us more flexibility, get rid of some of our slower SQL queries, and take some pressure off our main database, and it also has some nice sharding/scaling capabilities built in.
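    As an illustration of the kind of filtering the elasticsearch move enables, here's a sketch of a bool query built in Python. The field names are hypothetical, not the AO3's actual index schema; the query body itself follows elasticsearch's standard bool-query structure:

```python
# Hypothetical sketch: building an elasticsearch bool query for filtering
# works. Field names ("fandom", "rating", "tags") are assumptions for
# illustration, not the AO3's real index mapping.

def build_filter_query(fandom, rating=None, exclude_tags=()):
    must = [{"term": {"fandom": fandom}}]
    if rating:
        must.append({"term": {"rating": rating}})
    must_not = [{"term": {"tags": t}} for t in exclude_tags]
    return {"query": {"bool": {"must": must, "must_not": must_not}}}

q = build_filter_query("Homestuck", rating="Teen",
                       exclude_tags=["Major Character Death"])
# q is a JSON-serialisable body that could be sent to an elasticsearch
# index with an ordinary search request.
```

    The appeal over raw SQL is that the search engine answers these term-match questions from its own index, keeping the load off the main database.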

    We also try to cache as much data as we can, and that's something we're always looking to improve on. Systems and AD&T have discussed different options there, and we'll be continuing to work on small improvements and see what larger ones we may be able to incorporate.

    When will it be fixed?

    It's going to take us a few weeks to get through all the changes that we need to make. Our next code deploy will probably be within the next week - that will include email bundling of subscription and kudos notifications, so that we can scale our sending of emails better as well. After that, we'll be able to dedicate our resources to testing the search and browsing changes, and we're hoping to have that out to everyone by the end of June. We rely on volunteer time for coding and testing, so we need to schedule our work for evenings and weekends for the most part, but we're highly motivated to resolve the current problems, and we'll do our best to get the changes out to you as soon as we can.

    Improving the Archive is an ongoing task, and after we’ve made the changes to search and browse we’ll be continuing to work on other areas of the site to enable better scalability. We’re currently investigating the best options for developing the site going forward, including the possibility of paying for some training and/or expert advice to cover areas our existing volunteers don’t have much experience with. (If you have experience in these areas and time to work closely with our teams, we’d also welcome more volunteers!)

    Thanks for your patience!

    We know it's really annoying and frustrating when the site isn't working properly. We are working hard to fix it! We really appreciate the support of all our users. ♥

    Mirrored from an original post on the Archive of Our Own.

  • Enter the Wrangulator: Tag Wrangling Open House 22nd April

    By Lucy Pearson on Tuesday, 17 April 2012 - 11:37am

    Tag cloud representing a variety of tags used on the Archive of Our Own, together with a stylised version of the Archive logo designed to look like a confused face, scratching its head.

    Have you ever wondered about what it is tag wranglers do? Are you thinking about volunteering as a wrangler? Do you have a question about tags on the Archive of Our Own? Is your fandom in need of some temporary assistance? The Tag Wrangling Committee is hosting their second open house! This is a drop-in session where you can ask us what's on your mind, or just have a chat about tags.

    All are welcome! The chat will be held on Sunday, 22nd of April, 2012, from 19:00 to 21:00 UTC
    (see when this is in your timezone) in OTW's public chatroom on Campfire. (Please note: the chatroom URL has changed since this post was originally posted! Apologies for any confusion.) Feel free to drop by at any time during the session to ask questions or just to hang out.

    Additional Tag Wrangling Open Houses are planned for July and October. If you can't make this one, never fear - we'll be holding future sessions at different times to make it easier for people in different timezones to attend.

    The Tag Wrangling Committee and their team of volunteer “Tag Wranglers” maintain and administer the tags on the Archive of Our Own, curating the folksonomy system that links related tags together for better filtering and searching, while allowing users to tag their works however they prefer.

    Mirrored from an original post on the Archive of Our Own.

  • Rush hour on the AO3!

    By Lucy Pearson on Friday, 6 January 2012 - 9:03am

    As many of you will have noticed, we had some site slowdowns and 502 errors on the Archive of Our Own over the first couple of days of the year. Apologies for the inconvenience! If you've run into this problem and been wondering what was going on, you might be interested in this:

    Line graph of visits to the AO3, 4 December to 3 January. The graph peaks sharply on 1st January.

    Yes, it looks like lots of fans decided to celebrate the New Year with some delicious fanworks. On Monday 2nd January we had 182,958 visits and 1,066,216 pageviews! Furthermore, an octopus swam off with our servers - volta_arovet's Texts From Cephalopods had received 46,301 hits at the time of writing! So, our servers had plenty of work to do!

    Over 2000 new users have joined the Archive in the last couple of weeks, and we have hosted several great challenges, including Yuletide (2598 works!), Due South Secret Santa (a more modest 34 works), and Homestuck Ladyfest Exchange (124 works). So, while we're sorry to have had some slowdowns, overall we are super pleased with how well our shiny new servers have held up - those of you who were with us during the holiday season in previous years will remember that the high traffic of holiday challenges made our old servers very sad.

    Looking forward, we're not too worried about performance in the immediate future - there are some code improvements we know we need to make which will improve matters a lot, so those will be high priority. If the AO3 continues to expand at the same rate as this year, we will be looking at more servers sooner rather than later. But in light of the graph above, we're pleased that while we certainly slowed down, we didn't grind to a halt! Thanks to all the coders and sysadmins who did the work to make this possible, and thank you to all the OTW members whose donations helped us buy those hardworking servers (we are always grateful for volunteers or donations)! And, of course, thanks to everyone who reads and posts on the AO3 - we're excited to welcome so many of you!

    Once again, apologies to those of you who have been affected by the slowdowns - but hurray for so much beautiful fannish activity!

    Mirrored from an original post on the Archive of Our Own.

  • AO3 - 2011 in review!

    By Lucy Pearson on Saturday, 31 December 2011 - 9:51pm

    2011 was an amazing year for the Archive of Our Own, and we wanted to take a moment to look back and to thank everyone involved, including all of our users and volunteers! AO3 started its open beta about two years ago, towards the end of 2009. That year, we were really still putting the pieces together, building out the core functionality. In 2010, we started to pick up more momentum with people posting their works and archiving their older fic and art. We added gift exchange challenge hosting, kudos, downloads and skins. This year, we've done a lot of work on site performance and infrastructure, usability improvements, and new features like subscriptions and prompt meme challenges. We're looking forward to expanding on that next year and continuing to build a great, stable home for all kinds of fanworks!

    Traffic and performance

    A drawing of our seven machines!

    At the beginning of the year, we moved to a new and bigger set of servers, which gave the site some much-needed room to grow. Our systems team made some tweaks along the way, ensuring that we were getting the best performance out of the new setup. We started using Redis, which is super-fast, for email queues, autocompletes and other background tasks, taking some of the load off our main database.

    Even with all that work, it was tough to keep up with how fast the site was growing! Two-thirds of our current registered users signed up this year, and we kept giving out more and more invitations through our invite queue, but the numbers kept climbing - there were over 2,000 people on the waiting list for several months. (We've finally gotten that down now, just by sending out even more invitations as the system could handle them.) And many more site visitors aren't registered users - we now get well over half a million unique visitors each month, and there were more than 24 million pageviews in December. We now get as much traffic on an average day as we did last year during Yuletide, which at the time was a huge traffic spike. The period around Christmas, with Yuletide and other holiday exchanges going live, still represents a noticeable jump in traffic, but the difference isn't as great, which means more stable site performance. (\o/ We were standing by with fingers crossed just in case, but we were thrilled that no last-minute work was required this year!)

    Fun with charts!

    AO3 is currently home to over 8,100 fandoms, 31,000 users and 275,000 works! Here's a graph of work, chapter, bookmark and comment posting over the last three years:

    You can see that work posting is up this year, but what's much more dramatic is the increase in reading, bookmarking and commenting. There have also been more multi-chapter works and works-in-progress posted this year, which is exciting. And one of the neat features the archive has is the ability to go back and see what you've read or viewed, for registered users who have it enabled. Here's how that looks year-to-year:

    Lots of people viewing lots of stuff! There have also been almost 1.5 million kudos left since last year, so there's been no shortage of love to go around. <3

    What's on deck for 2012

    In the short term, we have a new release coming out hopefully early this month, and that will include improvements to our HTML parser (yay!), some exciting new subscription options and a variety of bugfixes. There are also a ton of other features and improvements that we've been developing this year that we hope to have ready for you in 2012, including the ability to view the site in other languages, art hosting, an on-site support area and a wealth of browsing, filtering and email improvements for both works and bookmarks. We also hope to start a series of international fandom spotlights in January and solicit more input from users about upcoming features.


    And finally: thank you! Thanks to all of the authors, artists, and vidders who have posted their works, to the mods who organize challenges and collections, to those who have shared skins for customizing the site, to everyone who creates bookmarks and leaves comments and kudos, encouraging authors and artists and making it easier for other fans to find awesome works. Thanks to everyone for bearing with our growing pains earlier in the year and for supporting AO3 financially, enabling it to continue operating and improving. And many thanks, as always, to everyone who volunteers their time wrangling tags, writing code, testing the site, handling support requests, and maintaining our systems, and also to everyone who's left comments and written in to our support team with feedback, suggestions, and bug reports, all of which are incredibly valuable! The archive is very much a community effort, and it couldn't exist without all of us working together and supporting one another.


    Mirrored from an original post on the Archive of Our Own.

  • Accessibility, Design and Technology Meeting - 3 December 2011

    By Lucy Pearson on Friday, 23 December 2011 - 2:00am

    The Accessibility, Design and Technology committee oversees technology-related projects within the OTW. Currently we are responsible for designing and building the Archive of Our Own. Our regular meeting updates keep you informed about developments on the AO3!

    This was our final meeting of the year: the OTW takes an end-of-year break during which committees dissolve and are reformed, and committee members take a well-earned rest! We've had an action-packed year, so we're all ready for a break (from meetings at least - a lot of our work goes on as usual). We'll resume in January - we don't take on any new volunteers during our hiatus, so if you volunteer between now and then (and we hope you will - as you can see below, there are several areas we're really keen to build up), you'll have to wait a little while to get started.

    Meeting highlights!

    Fandom landing pad!

    AD&T co-chair and senior coder Elz has been working for a while on improvements to browsing on the Archive. One thing she's been working on is a new 'fandom landing pad' so that browsing to a fandom will give you the option to browse to some important areas relating to that fandom. In this meeting we previewed her new design - going to a fandom landing page will give you a list of pairings and relationships in the fandom, and a list of authors and artists who have created work in that fandom, along with some basic information about the canon source. It's not quite ready for primetime yet, but it's looking very nifty!

    Issues for Love!

    Issues for love are the requests submitted by users via Support. We try to work through a few of these each meeting: we're working on ways of making it easier for people to see what has been suggested and what has been decided about the suggestions, but for now we'll include a round-up of our discussions in our meeting updates. Note that a decision to implement something does not necessarily mean it will be implemented soon - we have many issues to work on and a limited number of coders! If you want to see the full (and lengthy!) list of things logged for coders to work on you can check out our Google Code Issues. If you'd like to adopt an issue, we welcome new coders!

    • Request to add a setting to prompt-meme challenges to disallow anonymous prompts. This seemed like a handy extra feature without too much coding complexity, so we have logged it as an issue for a coder to work on.
    • Request to add an option to hide 'Share' buttons on works to reclaim screen real estate. It's already possible to disallow use of the share button on your own works, but you still see the button. We sympathise with the desire to reclaim the screen real estate, but we decided that adding a user preference to hide the buttons would add too much complexity (the more user preferences there are, the more complicated it becomes for people to figure out what they can set in preferences, so we try to limit the options to things where there is a lot of demand for a setting). Instead, we added some extra code to our buttons so that they can be selected with CSS, allowing people to build skins which hide the 'Share' button (or indeed any other button).
    • Suggestion for a 'challenge calendar' listing opening and closing dates for challenge sign-ups, plus dates for assignments due, works revealed, authors revealed, etc., which mods could opt into when creating a challenge. We loved this idea, but it is fairly complex to implement. Our lovely co-chair Amelia has volunteered to put together a design, so this is something we'll introduce in the future - but probably not for a while.
    • Request for a way to mark WIPs as abandoned, and a way to offer abandoned WIPs up for 'adoption' so that someone can finish them. We all agreed it would be really nice to have a quick way to flag that a WIP would never be finished, so we've logged that as an issue for a coder to implement. The idea of offering works up for adoption seems like it might have more limited appeal, so we agreed that for now, it would be better to leave this as something which people can simply indicate in the tags they use, if so desired (you can add 'Adopt this story' or indeed any other tag you wish as an additional tag to your work).
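For skin authors: once buttons can be targeted with CSS, a site skin along these lines could hide them. This is an illustrative sketch only - the actual class names used on the Archive's buttons may differ, so inspect the page markup to confirm the selector before using it:

```css
/* Hypothetical skin rule: hide the 'Share' button on work pages.
   The '.share' class name is an assumption -- check the real markup. */
.actions .share {
  display: none;
}
```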

    Reflecting on Release 0.8.9

    As most of you reading this will know, we had a big release of new code at the beginning of November. This release included a lot of exciting new stuff; unfortunately, it didn't go as smoothly as we had hoped. In this meeting we reflected on problematic areas and ways that we can improve in future:

    • We combined two big new features: the redesign of our front-end code and the new tag sets code for challenges and collections. We had decided to combine the two because the tag sets needed some front-end work anyway, and at the time we made the decision it made sense to roll the two things into one. However, the tag sets code was time sensitive: because it offers a new system of challenge nominations which significantly reduces the pressure on tag wranglers, we wanted to implement it in time for the big holiday challenges such as Yuletide. This meant that when we combined the two features, we had a lot more stuff to get ready within a set amount of time, which made everything more difficult. When we decided to merge the two, it didn't seem as if this was going to be a problem - but one thing we've learnt is that any deploy can bring unexpected hitches, so in the future if there's a time-sensitive feature we'll be trying to keep that as separate from other code as possible.
    • This was a big visual change, which meant that it had an impact on a large number of users: visual bugs tend to be encountered by lots of users, and even if there are no bugs, people still have lots of feedback about visual changes. We were aware of this; however, given the scale of the response to this deploy we realise we weren't prepared enough. We'll be doing more testing of interface changes in future, and exploring ways of beta-testing them with more users.
    • Since lots of people simply prefer the design they are used to, one thing we could have improved on was providing a way of going back to the old default design. We tried to provide for this with the One Point Faux option, but it had quite a few problems. So, in future this is something we'll be paying more attention to: if we introduce a big change, we'll try to provide ways of opting out. The good news is that going forward, this will actually be easier, because the new skins system is much more lightweight and it should be easier to provide some backwards compatibility (one reason this was problematic this time is that the underlying code for the old system was less than ideal, so everything had been completely rewritten).
    • We didn't have as much support documentation and information as we really needed for this deploy. In particular, we needed much fuller documentation on the new skins features so that people could try them out more easily and our Support team could point to useful information when helping people. There were several factors which led to a lack of documentation: crucially, several of the team who would normally take care of this were dealing with RL issues which limited the amount of time they could spend on it. In order to help avoid problems like this in future, we're building a deploy checklist which includes documentation, to make sure that we've considered whether we need additional documentation regardless of who is available to work on any given deploy. We're also aiming to build up a proper documentation team so that this work is less likely to fall through the cracks: if you're interested in being involved in this team, get in touch with our Volunteers and Recruitment Committee and let them know. We'd love to welcome new people to the team!
    • We also needed more documentation on the new features for testers, so that it was clearer what people needed to test and what they should expect it to do. This is an ongoing aim - we're working to improve our documentation across the board. Improving documentation for testers will also help us to address another issue, which was that feedback from testers got a bit scattered - having clear docs to start with would have helped us make it clearer what feedback needed to go where. Again, we're working on building up our testing procedures generally - if you're interested in getting involved with testing, let us know!

    While the problems we had with this deploy did highlight a number of areas where we need to work to improve, it's not all doom and gloom! There were also a number of things that went right with this deploy - we were able to fix critical bugs within 48 hours of the deploy, the Support team did a wonderful job keeping up with the many Support requests, and there was a huge amount of awesome code in the deploy itself. One reason the site is still in beta is that we're still learning the best processes for development (as well as because our code is new and rapidly changing): in the last four years we've gone from being a tiny group working on coding a 'closed' site (i.e. for the first two years we were just writing the code and testing, we didn't have any real users) to being a much larger group catering for a site of over 28,800 users! So, we're still figuring things out - objects may still shift in transit! We're pleased that we've been able to keep the site up and running, and everything largely functional, even though we've had the odd bump along the way. Thanks to everyone who has worked hard to make this true!

    Next deploy

    We're hoping to get one last deploy in before the end of 2011! It will include some updates to our HTML parser, some improvements to our static pages for collections and challenges, and Atom feeds for fandom tags! (Yay!)

    News from our sub-committees

    • Coders have been working on polishing off the issues to go in the next deploy. We're particularly excited about the forthcoming addition of Atom feeds for fandom tags - having tested this out for a good while now on the F/F tag, we think we can implement feeds without too much additional strain on the servers, and since this is a very popular request we're excited about launching it!
    • Testers have been testing the issues for next deploy, and discussing how they'd like to see the subcommittee develop next year. There have been some great discussions on what worked and what didn't this year, how we can build a stronger testing community, and how we can support our testers.
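For readers curious what an Atom feed gives you: it's a standard XML format, so new works in a fandom tag could be followed with any feed reader, or with a few lines of code. Here's a minimal sketch using Python's standard library - the sample feed below is invented for illustration, and actual AO3 feed contents and URLs may differ:

```python
# Parse the entries out of an Atom 1.0 feed using only the standard library.
import xml.etree.ElementTree as ET

# Atom elements live in this XML namespace.
ATOM = "{http://www.w3.org/2005/Atom}"

def parse_atom_entries(xml_text):
    """Return (title, link href) pairs for each entry in an Atom feed."""
    root = ET.fromstring(xml_text)
    results = []
    for entry in root.findall(ATOM + "entry"):
        title = entry.findtext(ATOM + "title", default="")
        link = entry.find(ATOM + "link")
        href = link.get("href") if link is not None else ""
        results.append((title, href))
    return results

# Invented example feed -- real feed contents will differ.
sample_feed = """<feed xmlns="http://www.w3.org/2005/Atom">
  <title>New works in Example Fandom</title>
  <entry>
    <title>An Example Work</title>
    <link href="https://example.org/works/1"/>
  </entry>
</feed>"""

print(parse_atom_entries(sample_feed))
# [('An Example Work', 'https://example.org/works/1')]
```

A feed reader does exactly this under the hood, then shows you the new entries as they appear.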

    News from our sister committees

    • Support have continued to work amazingly hard keeping up with a high number of tickets. Looking forward, they're also thinking about our documentation needs and places we need more information for users.
    • Tag wranglers have been discussing needs for next year with AD&T - the two committees will be meeting in the new term to talk over technical needs for tag wrangling. They've also been surveying all tag wrangling volunteers about their experiences this year, with a view to figuring out what works well and what can be improved on.

    If there are things you'd like to do or say, please share them in comments, via the AO3 support and feedback form, by volunteering (we won't be taking on new volunteers until the new term, but you can get in touch now to let us know you're interested), or in whatever medium you feel comfortable with. Everyone is welcome to this party!

    This meeting round-up by Lucy.

    Mirrored from an original post on the Archive of Our Own.

  • Accessibility, Design and Technology Meeting - 19 November 2011

    By Lucy Pearson on Friday, 25 November 2011 - 8:16pm

    The Accessibility, Design and Technology committee oversees technology-related projects within the OTW. Currently we are responsible for designing and building the Archive of Our Own. Our regular meeting updates keep you informed about developments on the AO3!

    AD&T and our associated committees and subcommittees have been very busy recently working towards our latest deploy and then working on issues arising from that. This one didn't go as smoothly as we had hoped (understatement!): we knew that there would be bugs revealed by practical use that hadn't appeared in testing, but there were more than we had anticipated, and we have been working hard to fix the immediate issues. We're happy to say that we were able to fix the most pressing problems within 48 hours; a week on from the deploy we've been able to address quite a few more, so those fixes will be deployed soon. We're really grateful to everyone who worked hard on this deploy and on addressing the issues subsequent to it. We are planning to do a thorough review of the deploy and think about the lessons learned and ways we can improve. This week, however, we focused on working through some outstanding business and outlining the tasks we need to complete before the end of the year.

    Meeting highlights!

    Goals for the rest of the year

    We're drawing close to the end of the 2011 term, so we started to think about what our priorities are for the rest of this year. On December 16th all the OTW committees officially dissolve and we take a break before reforming in January (although in practice many members of AD&T tend to do quite a bit of work during the hiatus, heh). So, we talked about a few things we'd like to get done before then:

    • Coding! Several people have code in-progress which they'd like to do some serious work on and hopefully finish by the end of the year. Site navigation, bookmarks improvements and a refactoring of our works code (important for tons of other improvements) are high on this list!
    • Revising our roadmap. We have a long-term plan for what features we'd like to implement on the Archive and when. However, we don't always implement things in exactly the order they are in on the roadmap: we have to be flexible and adapt according to a range of things including the pressing needs of the Archive at a given time, the coding expertise available, the level of difficulty involved in a specific project (this is not always as anticipated), and a bunch of other things. We also add new things to our to-do list based on feedback from other fans. So, we have to review our roadmap regularly to ensure it reflects our capabilities at any given time: right now it's out of date, so we'd like to get it updated to help shape plans for the coming year.
    • Completing our archive import code and rescuing some at-risk archives! Coder Naomi is currently putting the final touches to the code which will allow us to rescue archives which are no longer able to exist independently. Our immediate priority is the Smallville Slash Archive, which was hosted by the late, great Minotaur: several fans have worked hard to preserve this bit of fannish history, but it can't hold out much longer, so we want to help them out by the end of the year. We talked about the various things we need to do to make that a reality - stay posted for more news on this soon.
    • Reviewing our testing procedures: one area where we'd like to improve and develop more support is our testing team, where a very small number of people do very dedicated work. Our testing lead Kylie will be hosting a meeting for current testers to think about what works well and what can be improved, to make sure they can continue to work well into the future.
    • Exploring our tag wrangling options. The Tag Wrangling Committee and the big team of wranglers they manage do an amazing job at keeping the many, many user-generated tags on the Archive in order. However, we're growing at a massive rate and their job has become significantly bigger over a very short space of time. So, we want to talk to the TW Committee now to see if there are technical improvements that could make their lives easier, if the current system is still right for us, and whether there is anything else to think about from a tech point-of-view.

    Issues for love!

    As part of our recent push to address feature requests and feedback via Support more quickly, we made a big effort in this meeting to burn through some of the 'issues for love' which are awaiting committee discussion before they can move on to the next step. A few of the things we discussed:

    • Improving bookmarks: We get lots of support requests asking for more sorting and filtering of bookmarks, and our Support team wanted to know how our plans on this were progressing. We're happy to say this is being actively worked on and we hope to have it out by the beginning of next year.
    • Donate to the Archive: We also get quite a few Support requests asking how to donate time or money to the Archive. We've long been meaning to make this much clearer on the site itself, and we're happy to say that the code for a page with this information has now been submitted by our new intern, Firewolf. If you're wondering in the meantime, both these things are handled via our parent Organization for Transformative Works: get in touch with our Volunteers and Recruitment Committee if you're interested in helping out with the Archive, or make a donation to the OTW to help fund the site. Since the committee term for this year is coming to an end and we'll be taking a break, we won't be welcoming any new people to our teams until we reconvene in January, but we still welcome expressions of interest now!
    • Adding more than one related work: A couple of users had contacted Support to say they had works inspired by more than one other work and couldn't figure out how to show this. It is actually possible to add more than one related work - however, due to an oversight you have to add one, post the work, and then edit the second one in. This is clearly not very intuitive, so we're fixing it - and in the meantime we're adding some help text so people can find the workaround while we look at the more complex bit of the code.

    Next deploy

    The next deploy is scheduled for some time in the next week (depending on the availability of our team, several of whom have holiday celebrations this week). It will include a fix for the rich text editor (currently completely broken for some people), some fixes for oddities in skins, and a fix for index pages on subcollections (currently not showing up!).

    News from our sub-committees

    • Coders worked crazy hard to get our last deploy up and running, and then to fix some bugs arising afterwards. They did an awesome job of coming up with quick solutions to some of the bugs that showed up once we were on the real Archive - thanks to everyone for their hard work! More generally, they have been focusing on getting lots more projects out of the door before the end of the year. They are cooking up some exciting stuff, including navigation improvements and importing, so we're looking forward to having these see the light of day.
    • Testers have been super active lately! They tested the latest deploy in all sorts of situations and configurations, and then did more urgent testing to help fix the things that slipped through the cracks. The testing team is small and they do amazing work - thanks lovely people!

    News from our sister committees

    • Support have also been working very, very hard dealing with all the tickets arising from our recent deploy. Every new deploy produces an uptick in tickets, because new code inevitably means some new bugs (this is why whenever big companies release a new OS, it's usually followed shortly after by a bunch of updates!). This deploy produced more tickets than usual - a lot more! - but Support have been doing a sterling job keeping up. If you do find they are a little slower than usual, then rest assured they will get to you as soon as possible.
    • Tag wranglers have been awesome helping Support deal with tag-related tickets. The Tag Wrangling Committee gave AD&T some initial feedback on where tag wrangling stands at the moment, pending a meeting when we'll talk in more detail about tech needs for wrangling.

    If there are things you'd like to do or say, please share them in comments, via the AO3 support and feedback form, by volunteering, or in whatever medium you feel comfortable with. Everyone is welcome to this party!

    This meeting round-up by Lucy.

    Mirrored from an original post on the Archive of Our Own.

