Tag Archives: Rubriq

Rubriq partners with writeLaTeX


We’re excited to announce a new partnership with another top innovator in tools for researchers, writeLaTeX. The official press release follows below. If you aren’t familiar with their new Overleaf system, go to www.writelatex.com/overleaf to check it out. It’s an easy-to-use WYSIWYG manuscript editor with real-time collaboration that produces structured, fully typeset output automatically in the background as you type. And no, you don’t have to be a LaTeX user to benefit from either the original writeLaTeX or the new Overleaf system. It’s a great way to manage a developing manuscript, especially with multiple authors and contributors.

 


 

writeLaTeX partners with Rubriq to offer its authors direct access to pre-submission peer review services 

As an independent peer review service, Rubriq looks for opportunities to partner with other innovators in the scholarly publishing industry. WriteLaTeX gives authors an easy way to write and collaborate on their scientific documents through a user-friendly interface called Overleaf, which automatically typesets the paper in real time in the browser.

As a free service that lets you create, edit and share your scientific ideas easily online, writeLaTeX was a natural partner for Rubriq.

Rubriq has been integrated into writeLaTeX as a peer review component that researchers can use before they publish. Once authors have completed work on a paper in the writeLaTeX system, they want to be sure it is ready to publish.  To get the benefit of a journal-quality pre-submission review, they can submit it directly from writeLaTeX to Rubriq and get a critical evaluation by three experts in their field. This gives authors the opportunity to address any issues and increase their chances of acceptance once they do submit to a journal.

Dr. John Hammersley, co-founder at writeLaTeX, says: “Scientific publishing is evolving, and we’re keen to offer our users a wide range of destinations for their work. Rubriq offers a new alternative to the traditional publication route, and we’re delighted to be working with them to streamline submissions to their peer-review service.”

After a manuscript has been completed in the writeLaTeX system, authors simply select “Submit to Rubriq for Peer Review” (either in the top bar of the writeLaTeX screen or through the “Publish” menu). In one click, the manuscript files and metadata are passed over to the Rubriq submission system. Once the review report has been completed, the authors can go back and make their revisions in real time on the same document in the writeLaTeX system.

 

###

 

About Rubriq

Rubriq (www.rubriq.com) delivers objective, critical, pre-submission peer review of academic manuscripts from three carefully matched expert reviewers. Rubriq reviewers are all active PhD- or MD-level academics with established publishing and review experience. Reviewers are compensated for their work, and reviews are returned to the authors within two weeks. Authors pay $600 for the three-reviewer report, which covers the cost of reviewer recruitment and compensation. Our standardized, structured scorecards span all areas of study and include detailed comments as well as numeric scores, which allow the author to quickly understand the strengths and weaknesses of the manuscript, as assessed by experts in the field. Rubriq is a division of Research Square, which makes it a sister company to AJE (manuscript preparation services) and JournalGuide (a free online journal search tool). To find out more about Research Square, visit www.researchsquare.com.

 

About writeLaTeX

WriteLaTeX is a free service that lets you create, edit and share your scientific ideas easily online using LaTeX, a comprehensive and powerful tool for scientific writing. The company has grown rapidly since its launch in 2011, and today there are tens of thousands of active users who’ve created over a million projects. WriteLaTeX was founded by John Hammersley and John Lees-Miller, two mathematicians who worked together on the pioneering Ultra PRT Project and who were inspired by their own experiences in academia to create a better solution for collaborative scientific writing. In 2014, Overleaf aims to make science and R&D faster, more open and more transparent by bringing the whole scientific process into the cloud, from idea to writing to review to publication. Overleaf makes the power of professional typesetting immediately accessible to all scientists and technical writers at all stages of their career. Find out more at www.writelatex.com/overleaf.


Filed under People & Partners

Rubriq Presentation from SSP Annual Meeting 2013

Missed us at SSP? Want to know the latest things we’re cooking up at Rubriq? In this video Keith Collier re-presents all of his slides from the SSP session (Concurrent 4E: The Future of Peer Review: Game Changers). This presentation gives a detailed (~20 min) overview of our independent peer review service, as well as a preview of one of our new free author tools. It was designed for the SSP audience, which is primarily journals, editors, and publishers.

From the SSP 2013 Annual Meeting – http://www.sspnet.org/events/past-events/2013-annual-meeting/schedule/


June 19, 2013 · 7:51 pm

How we found 15 million hours of lost time

Lost time in the current peer review process

Rubriq wants to recover lost hours from redundant reviews so they can be put back into research. In the current journal submission process, rejection is common, yet reviews are rarely shared from one journal to the next.  Even when reviews are passed along, the lack of an industry-wide standard means that each journal in the chain solicits its own reviews before making a decision. All of this leads to reviewers repeating work that has already been done on the same manuscript by other colleagues.  We estimate that over 15 million hours are spent on redundant or unnecessary reviews – every year. 

Here’s a video that helps illustrate the key issues:


So how did we get to that number of 15 million hours each year?

The two key metrics for finding wasted time are quantity (how many manuscripts are reviewed and then rejected?) and time (how long does each submission take to be reviewed?). While there are 28,000 peer-reviewed journals, we only use 12,000 in our calculations since that is roughly the number of high-quality journals included in Thomson Reuters’ Web of Science.  The figure below shows how we calculated both quantity and time, and the descriptions and citations for the key steps in the process follow:

Rubriq calculation of time lost to peer review
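In short, the calculation boils down to: lost hours per year ≈ (submissions rejected after peer review) × (hours per review) × (reviewers per submission). The numbered steps below fill in each of those quantities.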

 

Calculation & Source Details:

1.  3,360,207 (English-language, STM) submissions per year

  • Although the Ware & Mabe STM Report1 showed that there are over 28,000 peer-reviewed journals, we focused on just the 12,000 English-language STM journals identified in that same report, as they are the current focus for Rubriq.
  • The Thomson Reuters data2 showed an average of roughly 280 submissions per journal (total ScholarOne submissions divided by the count of ScholarOne journal sites). Applying that average across the 12,000 journals gives 3,360,207 submissions per year.
  • Note that this is submission-based data, not paper-based. A single manuscript that was rejected by one journal but then accepted by another within the same year would go through two review cycles and thus be counted as two separate submissions.

2.  1,344,099 (40%) accepted submissions per year

  • The Thomson Reuters data2 reports 37% acceptance based on all submissions received and accepted within their system, but the Ware PRC report3 estimated an average of 50%.
  • We feel the Thomson Reuters data is more accurate than the PRC data based on how the information was collected and how the calculations were made. Combining that with our own internal data and personal interviews with some of the largest STM publishers, we selected 40% as the best representation for this group of journals. Taking 40% of our total submission number gives 1,344,099 accepted papers.

3.  705,652 (21%) submissions per year rejected WITHOUT Review

  • The Ware PRC report3 gave 21% as its estimate for submissions that are rejected without going through peer review, also known as a “desk rejection”.
  • Although time is lost and there is an opportunity cost to authors who must then try again with another journal, we are currently focused only on time spent in peer review, so we do not factor this group into our calculation of wasted time.

4.  1,310,496 (39%) submissions per year rejected WITH Review

  • The number of submissions that are sent to peer review but are then rejected is our key starting metric for calculating lost hours (why? See our “Additional Reading” section below for some background material). We use the two preceding calculations to find this number.
  • If 21% were rejected without review and 40% were accepted, then the remaining 39% (100% - 21% - 40%) were rejected after the peer review process. Applying 39% to our total gives us 1,310,496.

5.  11.5 average reviewer hours spent per submission

  • Data from Ware (2011)4 provided us with an average (median) of five hours spent per review.
  • The Ware PRC report3 states that an average of 2.3 reviewers is used for each submission.
  • Five hours * 2.3 reviewers equals 11.5 average review hours per submission.
  • Note that this number only takes into account the time spent per submission by reviewers – it does not include time spent by the journal or publisher in coordinating the review process (e.g., recruiting reviewers, editorial check of reviews, review software costs) or other time spent processing these papers (e.g., screening, editorial review, technical check, other operational time).

6. 15,070,706 hours per year spent on redundant reviews

  • Assuming 11.5 hours per submission * 1,310,496 submissions that were reviewed but then rejected = over 15 million hours. Every year.
  • Since there are only 8,760 hours in a year, you can also think of it as the equivalent of 1,720 years (if it were all one reviewer working 24 hours per day).
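To make the arithmetic easy to check or adapt, here is a minimal sketch of the calculation in Python, using the figures cited in steps 1–5 above. The variable names are just our own labels, and the results differ slightly from the published totals because the per-journal submission average is rounded to 280.

# Rough reproduction of the "15 million hours" estimate described above.
journals = 12_000                  # English-language STM journals (step 1)
submissions_per_journal = 280      # rounded ScholarOne average per journal (step 1)
accept_rate = 0.40                 # accepted after review (step 2)
desk_reject_rate = 0.21            # rejected without review (step 3)
hours_per_review = 5               # median reviewer hours per review (step 5)
reviewers_per_submission = 2.3     # average reviewers per submission (step 5)

total_submissions = journals * submissions_per_journal
rejected_after_review_rate = 1 - accept_rate - desk_reject_rate       # 0.39 (step 4)
rejected_after_review = total_submissions * rejected_after_review_rate
hours_per_submission = hours_per_review * reviewers_per_submission    # 11.5 (step 5)
lost_hours_per_year = rejected_after_review * hours_per_submission    # step 6

print(f"Total submissions per year:   {total_submissions:,.0f}")
print(f"Rejected after review:        {rejected_after_review:,.0f}")
print(f"Reviewer hours lost per year: {lost_hours_per_year:,.0f}")
print(f"Equivalent full-time years:   {lost_hours_per_year / 8_760:,.0f}")

Run as written, this gives roughly 3.36 million submissions, about 1.31 million rejected after review, and a little over 15 million reviewer hours per year, or about 1,720 years of round-the-clock reviewing.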

 

References/Links:

1. M. Ware, M. Mabe, The STM Report: An overview of scientific and scholarly journal publishing (International Association of Scientific, Technical, and Medical Publishers, Oxford, UK, 2012; http://www.stm-assoc.org/2012_12_11_STM_Report_2012.pdf)

2. Thomson Reuters, Global Publishing: Changes in submission trends and the impact on scholarly publishers (April 2012; http://scholarone.com/about/industry_insights/)

3. M. Ware, Peer review: benefits, perceptions, and alternatives (Publishing Research Consortium, London UK, 2008; http://www.publishingresearch.net/documents/PRCsummary4Warefinal.pdf)

4. M. Ware, Peer review: Recent experience and future directions (New Review of Information Networking, 16:1, 23-53, 2011; http://dx.doi.org/10.1080/13614576.2011.566812)

 

Have other questions? Found a better number with your own calculations? Feel free to add your comments here on our blog!

 


Filed under Uncategorized