Compare commits


No commits in common. "ext-ce" and "v5.4.1" have entirely different histories.

2,276 changed files with 56,978 additions and 103,712 deletions

@@ -1,19 +1,10 @@
---
name: Bug report
about: Report a bug
title: ''
labels: type:bug
assignees: ''
---
<!--
Note: If you are using www.overleaf.com and have a problem,
or if you would like to request a new feature please contact
the support team at support@overleaf.com
This form should only be used to report bugs in the
Community Edition release of Overleaf.
-->

@@ -14,52 +14,39 @@
<a href="#license">License</a>
</p>
<img src="doc/screenshot.png" alt="A screenshot of a project being edited in Overleaf Extended Community Edition">
<img src="doc/screenshot.png" alt="A screenshot of a project being edited in Overleaf Community Edition">
<p align="center">
Figure 1: A screenshot of a project being edited in Overleaf Extended Community Edition.
Figure 1: A screenshot of a project being edited in Overleaf Community Edition.
</p>
## Community Edition
[Overleaf](https://www.overleaf.com) is an open-source online real-time collaborative LaTeX editor. Overleaf runs a hosted version at [www.overleaf.com](https://www.overleaf.com), but you can also run your own local version, and contribute to the development of Overleaf.
## Extended Community Edition
The present "extended" version of Overleaf CE includes:
- Template Gallery
- Sandboxed Compiles with TeX Live image selection
- LDAP authentication
- SAML authentication
- OpenID Connect authentication
- Real-time track changes and comments
- Autocomplete of reference keys
- Symbol Palette
- "From External URL" feature
> [!CAUTION]
> Overleaf Community Edition is intended for use in environments where **all** users are trusted. Community Edition is **not** appropriate for scenarios where isolation of users is required, because Sandboxed Compiles are not available. When not using Sandboxed Compiles, users have full read and write access to the `sharelatex` container resources (filesystem, network, environment variables) when running LaTeX compiles.
> Therefore, in any environment where not all users can be fully trusted, it is strongly recommended to enable the Sandboxed Compiles feature available in the Extended Community Edition.

For more information on Sandboxed Compiles, check out the Overleaf [documentation](https://docs.overleaf.com/on-premises/configuration/overleaf-toolkit/server-pro-only-configuration/sandboxed-compiles).
[Overleaf](https://www.overleaf.com) is an open-source online real-time collaborative LaTeX editor. We run a hosted version at [www.overleaf.com](https://www.overleaf.com), but you can also run your own local version, and contribute to the development of Overleaf.
## Enterprise
If you want help installing and maintaining Overleaf in your lab or workplace, Overleaf offers an officially supported version called [Overleaf Server Pro](https://www.overleaf.com/for/enterprises).
If you want help installing and maintaining Overleaf in your lab or workplace, we offer an officially supported version called [Overleaf Server Pro](https://www.overleaf.com/for/enterprises). It also includes more features for security (SSO with LDAP or SAML), administration and collaboration (e.g. tracked changes). [Find out more!](https://www.overleaf.com/for/enterprises)
## Keeping up to date
Sign up to the [mailing list](https://mailchi.mp/overleaf.com/community-edition-and-server-pro) to get updates on Overleaf releases and development.
## Installation
Detailed installation instructions can be found in the [Overleaf Toolkit](https://github.com/overleaf/toolkit/).
Configuration details and release history for the Extended Community Edition can be found on the [Extended CE Wiki Page](https://github.com/yu-i-i/overleaf-cep/wiki).
We have detailed installation instructions in the [Overleaf Toolkit](https://github.com/overleaf/toolkit/).
## Upgrading
If you are upgrading from a previous version of Overleaf, please see the [Release Notes section on the Wiki](https://github.com/overleaf/overleaf/wiki#release-notes) for all of the versions between your current version and the version you are upgrading to.
## Overleaf Docker Image
This repo contains two dockerfiles, [`Dockerfile-base`](server-ce/Dockerfile-base), which builds the
`sharelatex/sharelatex-base:ext-ce` image, and [`Dockerfile`](server-ce/Dockerfile) which builds the
`sharelatex/sharelatex:ext-ce` image.
`sharelatex/sharelatex-base` image, and [`Dockerfile`](server-ce/Dockerfile) which builds the
`sharelatex/sharelatex` (or "community") image.
The Base image generally contains the basic dependencies like `wget`, plus `texlive`.
This is split out because it's a pretty heavy set of
We split this out because it's a pretty heavy set of
dependencies, and it's nice to not have to rebuild all of that every time.
The `sharelatex/sharelatex` image extends the base image and adds the actual Overleaf code
@@ -67,16 +54,20 @@ and services.
Use `make build-base` and `make build-community` from `server-ce/` to build these images.
The [Phusion base-image](https://github.com/phusion/baseimage-docker)
(which is extended by the `base` image) provides a VM-like container
We use the [Phusion base-image](https://github.com/phusion/baseimage-docker)
(which is extended by our `base` image) to provide us with a VM-like container
in which to run the Overleaf services. Baseimage uses the `runit` service
manager to manage services, and init scripts from the `server-ce/runit`
folder are added.
manager to manage services, and we add our init-scripts from the `server-ce/runit`
folder.
## Contributing
Please see the [CONTRIBUTING](CONTRIBUTING.md) file for information on contributing to the development of Overleaf.
## Authors
[The Overleaf Team](https://www.overleaf.com/about)
[yu-i-i](https://github.com/yu-i-i/overleaf-cep) — Extensions for CE unless otherwise noted
## License

@@ -42,7 +42,7 @@ To do this, use the included `bin/dev` script:
bin/dev
```
This will start all services using `node --watch`, which will automatically monitor the code and restart the services as necessary.
This will start all services using `nodemon`, which will automatically monitor the code and restart the services as necessary.
To improve performance, you can start only a subset of the services in development mode by providing a space-separated list to the `bin/dev` script:
@@ -77,7 +77,6 @@ each service:
| `filestore` | 9235 |
| `notifications` | 9236 |
| `real-time` | 9237 |
| `references` | 9238 |
| `history-v1` | 9239 |
| `project-history` | 9240 |

@@ -6,18 +6,14 @@ DOCUMENT_UPDATER_HOST=document-updater
FILESTORE_HOST=filestore
GRACEFUL_SHUTDOWN_DELAY_SECONDS=0
HISTORY_V1_HOST=history-v1
HISTORY_REDIS_HOST=redis
LISTEN_ADDRESS=0.0.0.0
MONGO_HOST=mongo
MONGO_URL=mongodb://mongo/sharelatex?directConnection=true
NOTIFICATIONS_HOST=notifications
PROJECT_HISTORY_HOST=project-history
QUEUES_REDIS_HOST=redis
REALTIME_HOST=real-time
REDIS_HOST=redis
REFERENCES_HOST=references
SESSION_SECRET=foo
V1_HISTORY_HOST=history-v1
WEBPACK_HOST=webpack
WEB_API_PASSWORD=overleaf
WEB_API_USER=overleaf

@@ -112,19 +112,8 @@ services:
- ../services/real-time/app.js:/overleaf/services/real-time/app.js
- ../services/real-time/config:/overleaf/services/real-time/config
references:
command: ["node", "--watch", "app.js"]
environment:
- NODE_OPTIONS=--inspect=0.0.0.0:9229
ports:
- "127.0.0.1:9238:9229"
volumes:
- ../services/references/app:/overleaf/services/references/app
- ../services/references/config:/overleaf/services/references/config
- ../services/references/app.js:/overleaf/services/references/app.js
web:
command: ["node", "--watch", "app.mjs", "--watch-locales"]
command: ["node", "--watch", "app.js", "--watch-locales"]
environment:
- NODE_OPTIONS=--inspect=0.0.0.0:9229
ports:

@@ -1,5 +1,6 @@
volumes:
clsi-cache:
clsi-output:
filestore-public-files:
filestore-template-files:
filestore-uploads:
@@ -25,16 +26,15 @@ services:
env_file:
- dev.env
environment:
- DOCKER_RUNNER=true
- TEXLIVE_IMAGE=texlive-full # docker build texlive -t texlive-full
- SANDBOXED_COMPILES=true
- SANDBOXED_COMPILES_HOST_DIR_COMPILES=${PWD}/compiles
- SANDBOXED_COMPILES_HOST_DIR_OUTPUT=${PWD}/output
- COMPILES_HOST_DIR=${PWD}/compiles
user: root
volumes:
- ${PWD}/compiles:/overleaf/services/clsi/compiles
- ${PWD}/output:/overleaf/services/clsi/output
- ${DOCKER_SOCKET_PATH:-/var/run/docker.sock}:/var/run/docker.sock
- clsi-cache:/overleaf/services/clsi/cache
- clsi-output:/overleaf/services/clsi/output
contacts:
build:
@@ -123,7 +123,7 @@ services:
dockerfile: services/real-time/Dockerfile
env_file:
- dev.env
redis:
image: redis:5
ports:
@@ -131,13 +131,6 @@ services:
volumes:
- redis-data:/data
references:
build:
context: ..
dockerfile: services/references/Dockerfile
env_file:
- dev.env
web:
build:
context: ..
@@ -147,7 +140,7 @@ services:
- dev.env
environment:
- APP_NAME=Overleaf Community Edition
- ENABLED_LINKED_FILE_TYPES=project_file,project_output_file,url
- ENABLED_LINKED_FILE_TYPES=project_file,project_output_file
- EMAIL_CONFIRMATION_DISABLED=true
- NODE_ENV=development
- OVERLEAF_ALLOW_PUBLIC_ACCESS=true
@@ -168,7 +161,6 @@ services:
- notifications
- project-history
- real-time
- references
webpack:
build:

Binary file not shown. (Before: 13 KiB, After: 71 KiB)

Binary file not shown. (Before: 1 MiB, After: 587 KiB)

@@ -32,7 +32,7 @@ services:
OVERLEAF_REDIS_HOST: redis
REDIS_HOST: redis
ENABLED_LINKED_FILE_TYPES: 'project_file,project_output_file,url'
ENABLED_LINKED_FILE_TYPES: 'project_file,project_output_file'
# Enables Thumbnail generation using ImageMagick
ENABLE_CONVERSIONS: 'true'
@@ -73,19 +73,11 @@ services:
## Server Pro ##
################
## The Community Edition is intended for use in environments where all users are trusted and is not appropriate for
## scenarios where isolation of users is required. Sandboxed Compiles are not available in the Community Edition,
## so the following environment variables must be commented out to avoid compile issues.
##
## Sandboxed Compiles: https://docs.overleaf.com/on-premises/configuration/overleaf-toolkit/server-pro-only-configuration/sandboxed-compiles
## Sandboxed Compiles: https://github.com/overleaf/overleaf/wiki/Server-Pro:-Sandboxed-Compiles
SANDBOXED_COMPILES: 'true'
### Bind-mount source for /var/lib/overleaf/data/compiles inside the container.
SANDBOXED_COMPILES_HOST_DIR_COMPILES: '/home/user/sharelatex_data/data/compiles'
### Bind-mount source for /var/lib/overleaf/data/output inside the container.
SANDBOXED_COMPILES_HOST_DIR_OUTPUT: '/home/user/sharelatex_data/data/output'
### Backwards compatibility (before Server Pro 5.5)
DOCKER_RUNNER: 'true'
SANDBOXED_COMPILES_SIBLING_CONTAINERS: 'true'
### Bind-mount source for /var/lib/overleaf/data/compiles inside the container.
SANDBOXED_COMPILES_HOST_DIR: '/home/user/sharelatex_data/data/compiles'
## Works with test LDAP server shown at bottom of docker compose
# OVERLEAF_LDAP_URL: 'ldap://ldap:389'

@@ -0,0 +1 @@
node_modules/

@@ -0,0 +1,46 @@
compileFolder
# Compiled source #
###################
*.com
*.class
*.dll
*.exe
*.o
*.so
# Packages #
############
# it's better to unpack these files and commit the raw source
# git has its own built in compression methods
*.7z
*.dmg
*.gz
*.iso
*.jar
*.rar
*.tar
*.zip
# Logs and databases #
######################
*.log
*.sql
*.sqlite
# OS generated files #
######################
.DS_Store?
ehthumbs.db
Icon?
Thumbs.db
/node_modules/*
data/*/*
**.swp
/log.json
hash_folder
.npmrc

@@ -1 +1 @@
22.17.0
20.18.2

@@ -1,10 +1,10 @@
access-token-encryptor
--dependencies=None
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
--docker-repos=gcr.io/overleaf-ops
--env-add=
--env-pass-through=
--esmock-loader=False
--is-library=True
--node-version=22.17.0
--node-version=20.18.2
--public-repo=False
--script-version=4.7.0
--script-version=4.5.0

@@ -0,0 +1 @@
node_modules/

libraries/fetch-utils/.gitignore (new file)

@@ -0,0 +1,3 @@
# managed by monorepo$ bin/update_build_scripts
.npmrc

@@ -1 +1 @@
22.17.0
20.18.2

@@ -1,10 +1,10 @@
fetch-utils
--dependencies=None
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
--docker-repos=gcr.io/overleaf-ops
--env-add=
--env-pass-through=
--esmock-loader=False
--is-library=True
--node-version=22.17.0
--node-version=20.18.2
--public-repo=False
--script-version=4.7.0
--script-version=4.5.0

@@ -23,11 +23,11 @@ async function fetchJson(url, opts = {}) {
}
async function fetchJsonWithResponse(url, opts = {}) {
const { fetchOpts, detachSignal } = parseOpts(opts)
const { fetchOpts } = parseOpts(opts)
fetchOpts.headers = fetchOpts.headers ?? {}
fetchOpts.headers.Accept = fetchOpts.headers.Accept ?? 'application/json'
const response = await performRequest(url, fetchOpts, detachSignal)
const response = await performRequest(url, fetchOpts)
if (!response.ok) {
const body = await maybeGetResponseBody(response)
throw new RequestFailedError(url, opts, response, body)
@@ -53,8 +53,8 @@ async function fetchStream(url, opts = {}) {
}
async function fetchStreamWithResponse(url, opts = {}) {
const { fetchOpts, abortController, detachSignal } = parseOpts(opts)
const response = await performRequest(url, fetchOpts, detachSignal)
const { fetchOpts, abortController } = parseOpts(opts)
const response = await performRequest(url, fetchOpts)
if (!response.ok) {
const body = await maybeGetResponseBody(response)
@@ -76,8 +76,8 @@ async function fetchStreamWithResponse(url, opts = {}) {
* @throws {RequestFailedError} if the response has a failure status code
*/
async function fetchNothing(url, opts = {}) {
const { fetchOpts, detachSignal } = parseOpts(opts)
const response = await performRequest(url, fetchOpts, detachSignal)
const { fetchOpts } = parseOpts(opts)
const response = await performRequest(url, fetchOpts)
if (!response.ok) {
const body = await maybeGetResponseBody(response)
throw new RequestFailedError(url, opts, response, body)
@@ -108,9 +108,9 @@ async function fetchRedirect(url, opts = {}) {
* @throws {RequestFailedError} if the response has a non redirect status code or missing Location header
*/
async function fetchRedirectWithResponse(url, opts = {}) {
const { fetchOpts, detachSignal } = parseOpts(opts)
const { fetchOpts } = parseOpts(opts)
fetchOpts.redirect = 'manual'
const response = await performRequest(url, fetchOpts, detachSignal)
const response = await performRequest(url, fetchOpts)
if (response.status < 300 || response.status >= 400) {
const body = await maybeGetResponseBody(response)
throw new RequestFailedError(url, opts, response, body)
@@ -142,8 +142,8 @@ async function fetchString(url, opts = {}) {
}
async function fetchStringWithResponse(url, opts = {}) {
const { fetchOpts, detachSignal } = parseOpts(opts)
const response = await performRequest(url, fetchOpts, detachSignal)
const { fetchOpts } = parseOpts(opts)
const response = await performRequest(url, fetchOpts)
if (!response.ok) {
const body = await maybeGetResponseBody(response)
throw new RequestFailedError(url, opts, response, body)
@@ -178,14 +178,13 @@ function parseOpts(opts) {
const abortController = new AbortController()
fetchOpts.signal = abortController.signal
let detachSignal = () => {}
if (opts.signal) {
detachSignal = abortOnSignal(abortController, opts.signal)
abortOnSignal(abortController, opts.signal)
}
if (opts.body instanceof Readable) {
abortOnDestroyedRequest(abortController, fetchOpts.body)
}
return { fetchOpts, abortController, detachSignal }
return { fetchOpts, abortController }
}
function setupJsonBody(fetchOpts, json) {
@@ -209,9 +208,6 @@ function abortOnSignal(abortController, signal) {
abortController.abort(signal.reason)
}
signal.addEventListener('abort', listener)
return () => {
signal.removeEventListener('abort', listener)
}
}
function abortOnDestroyedRequest(abortController, stream) {
@@ -230,12 +226,11 @@ function abortOnDestroyedResponse(abortController, response) {
})
}
async function performRequest(url, fetchOpts, detachSignal) {
async function performRequest(url, fetchOpts) {
let response
try {
response = await fetch(url, fetchOpts)
} catch (err) {
detachSignal()
if (fetchOpts.body instanceof Readable) {
fetchOpts.body.destroy()
}
@@ -244,7 +239,6 @@ async function performRequest(url, fetchOpts, detachSignal) {
method: fetchOpts.method ?? 'GET',
})
}
response.body.on('close', detachSignal)
if (fetchOpts.body instanceof Readable) {
response.body.on('close', () => {
if (!fetchOpts.body.readableEnded) {

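This is the heart of the fetch-utils change: the ext-ce side (removed here) threads a `detachSignal` callback from `parseOpts` through `performRequest` so the listener it adds to a caller-supplied `AbortSignal` is removed once the request settles, while v5.4.1 leaves the listener attached, leaking one `'abort'` listener per request on a long-lived signal. A minimal sketch of the detach pattern outside fetch-utils (`requestOnce` is a hypothetical helper, not from the diff):

```js
// Sketch, assuming only standard Node 18+ APIs (AbortController, global fetch).
function forwardAbort(callerSignal, abortController) {
  const listener = () => abortController.abort(callerSignal.reason)
  callerSignal.addEventListener('abort', listener)
  // The detach function mirrors detachSignal in the removed code above.
  return () => callerSignal.removeEventListener('abort', listener)
}

async function requestOnce(url, { signal } = {}) { // hypothetical helper
  const abortController = new AbortController()
  const detach = signal ? forwardAbort(signal, abortController) : () => {}
  try {
    return await fetch(url, { signal: abortController.signal })
  } finally {
    detach() // without this, every call leaks an 'abort' listener on `signal`
  }
}
```

The removed tests in the next file assert exactly this accounting with `events.getEventListeners(signal, 'abort')`.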
@@ -1,9 +1,6 @@
const { expect } = require('chai')
const fs = require('node:fs')
const events = require('node:events')
const { FetchError, AbortError } = require('node-fetch')
const { Readable } = require('node:stream')
const { pipeline } = require('node:stream/promises')
const { once } = require('node:events')
const { TestServer } = require('./helpers/TestServer')
const selfsigned = require('selfsigned')
@@ -206,31 +203,6 @@ describe('fetch-utils', function () {
).to.be.rejectedWith(AbortError)
expect(stream.destroyed).to.be.true
})
it('detaches from signal on success', async function () {
const signal = AbortSignal.timeout(10_000)
for (let i = 0; i < 20; i++) {
const s = await fetchStream(this.url('/hello'), { signal })
expect(events.getEventListeners(signal, 'abort')).to.have.length(1)
await pipeline(s, fs.createWriteStream('/dev/null'))
expect(events.getEventListeners(signal, 'abort')).to.have.length(0)
}
})
it('detaches from signal on error', async function () {
const signal = AbortSignal.timeout(10_000)
for (let i = 0; i < 20; i++) {
try {
await fetchStream(this.url('/500'), { signal })
} catch (err) {
if (err instanceof RequestFailedError && err.response.status === 500)
continue
throw err
} finally {
expect(events.getEventListeners(signal, 'abort')).to.have.length(0)
}
}
})
})
describe('fetchNothing', function () {
@@ -419,16 +391,9 @@ async function* infiniteIterator() {
async function abortOnceReceived(func, server) {
const controller = new AbortController()
const promise = func(controller.signal)
expect(events.getEventListeners(controller.signal, 'abort')).to.have.length(1)
await once(server.events, 'request-received')
controller.abort()
try {
return await promise
} finally {
expect(events.getEventListeners(controller.signal, 'abort')).to.have.length(
0
)
}
return await promise
}
async function expectRequestAborted(req) {

@@ -0,0 +1 @@
node_modules/

libraries/logger/.gitignore (new file)

@@ -0,0 +1,3 @@
node_modules
.npmrc

@@ -1 +1 @@
22.17.0
20.18.2

@@ -1,10 +1,10 @@
logger
--dependencies=None
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
--docker-repos=gcr.io/overleaf-ops
--env-add=
--env-pass-through=
--esmock-loader=False
--is-library=True
--node-version=22.17.0
--node-version=20.18.2
--public-repo=False
--script-version=4.7.0
--script-version=4.5.0

@@ -0,0 +1 @@
node_modules/

libraries/metrics/.gitignore (new file)

@@ -0,0 +1,3 @@
node_modules
.npmrc

@@ -1 +1 @@
22.17.0
20.18.2

@@ -1,10 +1,10 @@
metrics
--dependencies=None
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
--docker-repos=gcr.io/overleaf-ops
--env-add=
--env-pass-through=
--esmock-loader=False
--is-library=True
--node-version=22.17.0
--node-version=20.18.2
--public-repo=False
--script-version=4.7.0
--script-version=4.5.0

@@ -5,8 +5,6 @@
* before any other module to support code instrumentation.
*/
const metricsModuleImportStartTime = performance.now()
const APP_NAME = process.env.METRICS_APP_NAME || 'unknown'
const BUILD_VERSION = process.env.BUILD_VERSION
const ENABLE_PROFILE_AGENT = process.env.ENABLE_PROFILE_AGENT === 'true'
@@ -105,5 +103,3 @@ function recordProcessStart() {
const metrics = require('.')
metrics.inc('process_startup')
}
module.exports = { metricsModuleImportStartTime }

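For context, the two removed lines at the top and bottom of this file are what export the import-time timestamp on the ext-ce side; they let a service measure startup duration relative to the moment the metrics module (loaded before anything else) was imported. A sketch of the pattern, assuming nothing beyond `perf_hooks`:

```js
// metrics-ish module: capture a timestamp before doing any other work.
const { performance } = require('node:perf_hooks')
const moduleImportStartTime = performance.now()
// ... instrumentation setup would go here ...
module.exports = { moduleImportStartTime }
```

A service can then log `performance.now() - moduleImportStartTime` once it is ready to accept traffic.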
@@ -9,7 +9,7 @@
"main": "index.js",
"dependencies": {
"@google-cloud/opentelemetry-cloud-trace-exporter": "^2.1.0",
"@google-cloud/profiler": "^6.0.3",
"@google-cloud/profiler": "^6.0.0",
"@opentelemetry/api": "^1.4.1",
"@opentelemetry/auto-instrumentations-node": "^0.39.1",
"@opentelemetry/exporter-trace-otlp-http": "^0.41.2",

@@ -0,0 +1 @@
node_modules/

libraries/mongo-utils/.gitignore (new file)

@@ -0,0 +1,3 @@
# managed by monorepo$ bin/update_build_scripts
.npmrc

@@ -1 +1 @@
22.17.0
20.18.2

@@ -16,7 +16,6 @@ let VERBOSE_LOGGING
let BATCH_RANGE_START
let BATCH_RANGE_END
let BATCH_MAX_TIME_SPAN_IN_MS
let BATCHED_UPDATE_RUNNING = false
/**
* @typedef {import("mongodb").Collection} Collection
@@ -35,7 +34,6 @@ let BATCHED_UPDATE_RUNNING = false
* @property {string} [BATCH_RANGE_START]
* @property {string} [BATCH_SIZE]
* @property {string} [VERBOSE_LOGGING]
* @property {(progress: string) => Promise<void>} [trackProgress]
*/
/**
@@ -211,71 +209,59 @@ async function batchedUpdate(
update,
projection,
findOptions,
batchedUpdateOptions = {}
batchedUpdateOptions
) {
// only a single batchedUpdate can run at a time due to global variables
if (BATCHED_UPDATE_RUNNING) {
throw new Error('batchedUpdate is already running')
ID_EDGE_PAST = await getIdEdgePast(collection)
if (!ID_EDGE_PAST) {
console.warn(
`The collection ${collection.collectionName} appears to be empty.`
)
return 0
}
try {
BATCHED_UPDATE_RUNNING = true
ID_EDGE_PAST = await getIdEdgePast(collection)
if (!ID_EDGE_PAST) {
console.warn(
`The collection ${collection.collectionName} appears to be empty.`
)
return 0
}
refreshGlobalOptionsForBatchedUpdate(batchedUpdateOptions)
const { trackProgress = async progress => console.warn(progress) } =
batchedUpdateOptions
refreshGlobalOptionsForBatchedUpdate(batchedUpdateOptions)
findOptions = findOptions || {}
findOptions.readPreference = READ_PREFERENCE_SECONDARY
findOptions = findOptions || {}
findOptions.readPreference = READ_PREFERENCE_SECONDARY
projection = projection || { _id: 1 }
let nextBatch
let updated = 0
let start = BATCH_RANGE_START
projection = projection || { _id: 1 }
let nextBatch
let updated = 0
let start = BATCH_RANGE_START
while (start !== BATCH_RANGE_END) {
let end = getNextEnd(start)
nextBatch = await getNextBatch(
collection,
query,
start,
end,
projection,
findOptions
)
if (nextBatch.length > 0) {
end = nextBatch[nextBatch.length - 1]._id
updated += nextBatch.length
while (start !== BATCH_RANGE_END) {
let end = getNextEnd(start)
nextBatch = await getNextBatch(
collection,
query,
start,
end,
projection,
findOptions
)
if (nextBatch.length > 0) {
end = nextBatch[nextBatch.length - 1]._id
updated += nextBatch.length
if (VERBOSE_LOGGING) {
console.log(
`Running update on batch with ids ${JSON.stringify(
nextBatch.map(entry => entry._id)
)}`
)
}
await trackProgress(
`Running update on batch ending ${renderObjectId(end)}`
if (VERBOSE_LOGGING) {
console.log(
`Running update on batch with ids ${JSON.stringify(
nextBatch.map(entry => entry._id)
)}`
)
if (typeof update === 'function') {
await update(nextBatch)
} else {
await performUpdate(collection, nextBatch, update)
}
} else {
console.error(`Running update on batch ending ${renderObjectId(end)}`)
}
if (typeof update === 'function') {
await update(nextBatch)
} else {
await performUpdate(collection, nextBatch, update)
}
await trackProgress(`Completed batch ending ${renderObjectId(end)}`)
start = end
}
return updated
} finally {
BATCHED_UPDATE_RUNNING = false
console.error(`Completed batch ending ${renderObjectId(end)}`)
start = end
}
return updated
}
/**

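Two behavioural differences are visible in this hunk: ext-ce guards against concurrent runs with `BATCHED_UPDATE_RUNNING` (the helper keeps its batch cursor in module-level globals) and reports progress through an injectable `trackProgress` callback, while v5.4.1 writes progress straight to `console.error`. A hedged usage sketch based only on the signature shown above; the collection, query, and update are illustrative, and the require path is assumed:

```js
// Illustrative call; only batchedUpdate's parameter list comes from the diff.
const { batchedUpdate } = require('@overleaf/mongo-utils/batchedUpdate')

async function archiveStaleProjects(db) {
  const updated = await batchedUpdate(
    db.collection('projects'),          // collection, walked in _id order
    { lastOpened: { $exists: false } }, // query (illustrative)
    { $set: { archived: true } },       // update doc, or an async fn per batch
    { _id: 1 },                         // projection
    {},                                 // findOptions
    {
      VERBOSE_LOGGING: 'true',          // options are env-style strings
      trackProgress: async p => console.log(p), // ext-ce only
    }
  )
  console.log(`updated ${updated} documents`)
}
```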
@@ -1,10 +1,10 @@
mongo-utils
--dependencies=None
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
--docker-repos=gcr.io/overleaf-ops
--env-add=
--env-pass-through=
--esmock-loader=False
--is-library=True
--node-version=22.17.0
--node-version=20.18.2
--public-repo=False
--script-version=4.7.0
--script-version=4.5.0

@@ -0,0 +1 @@
node_modules/

libraries/o-error/.gitignore (new file)

@@ -0,0 +1,5 @@
.nyc_output
coverage
node_modules/
.npmrc

@@ -1 +1 @@
22.17.0
20.18.2

@@ -1,10 +1,10 @@
o-error
--dependencies=None
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
--docker-repos=gcr.io/overleaf-ops
--env-add=
--env-pass-through=
--esmock-loader=False
--is-library=True
--node-version=22.17.0
--node-version=20.18.2
--public-repo=False
--script-version=4.7.0
--script-version=4.5.0

@@ -1,34 +1,20 @@
// @ts-check
/**
* Light-weight helpers for handling JavaScript Errors in node.js and the
* browser.
*/
class OError extends Error {
/**
* The error that is the underlying cause of this error
*
* @type {unknown}
*/
cause
/**
* List of errors encountered as the callback chain is unwound
*
* @type {TaggedError[] | undefined}
*/
_oErrorTags
/**
* @param {string} message as for built-in Error
* @param {Object} [info] extra data to attach to the error
* @param {unknown} [cause] the internal error that caused this error
* @param {Error} [cause] the internal error that caused this error
*/
constructor(message, info, cause) {
super(message)
this.name = this.constructor.name
if (info) this.info = info
if (cause) this.cause = cause
/** @private @type {Array<TaggedError> | undefined} */
this._oErrorTags // eslint-disable-line
}
/**
@@ -45,7 +31,7 @@ class OError extends Error {
/**
* Wrap the given error, which caused this error.
*
* @param {unknown} cause the internal error that caused this error
* @param {Error} cause the internal error that caused this error
* @return {this}
*/
withCause(cause) {
@@ -79,16 +65,13 @@ class OError extends Error {
* }
* }
*
* @template {unknown} E
* @param {E} error the error to tag
* @param {Error} error the error to tag
* @param {string} [message] message with which to tag `error`
* @param {Object} [info] extra data with wich to tag `error`
* @return {E} the modified `error` argument
* @return {Error} the modified `error` argument
*/
static tag(error, message, info) {
const oError = /** @type {{ _oErrorTags: TaggedError[] | undefined }} */ (
error
)
const oError = /** @type{OError} */ (error)
if (!oError._oErrorTags) oError._oErrorTags = []
@@ -119,7 +102,7 @@
*
* If an info property is repeated, the last one wins.
*
* @param {unknown} error any error (may or may not be an `OError`)
* @param {Error | null | undefined} error any error (may or may not be an `OError`)
* @return {Object}
*/
static getFullInfo(error) {
@@ -146,7 +129,7 @@
* Return the `stack` property from `error`, including the `stack`s for any
* tagged errors added with `OError.tag` and for any `cause`s.
*
* @param {unknown} error any error (may or may not be an `OError`)
* @param {Error | null | undefined} error any error (may or may not be an `OError`)
* @return {string}
*/
static getFullStack(error) {
@@ -160,7 +143,7 @@
stack += `\n${oError._oErrorTags.map(tag => tag.stack).join('\n')}`
}
const causeStack = OError.getFullStack(oError.cause)
const causeStack = oError.cause && OError.getFullStack(oError.cause)
if (causeStack) {
stack += '\ncaused by:\n' + indent(causeStack)
}

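The substantive change in this file is type-level: ext-ce lets `cause` and `OError.tag` accept any thrown value (`unknown`), while v5.4.1 requires `Error` instances, and the removed tests below exercise the non-Error cases. The tagging workflow itself is the same on both sides; a short sketch (npm package name assumed):

```js
const OError = require('@overleaf/o-error') // package name assumed
const fs = require('node:fs/promises')

async function readConfig(path) {
  try {
    return await fs.readFile(path, 'utf8')
  } catch (err) {
    // tag() records the message, info, and a stack frame on the error as the
    // chain unwinds, and returns the same object, which we rethrow.
    throw OError.tag(err, 'failed to read config', { path })
  }
}

readConfig('/nonexistent').catch(err => {
  console.error(OError.getFullStack(err)) // original stack plus tag frames
  console.error(OError.getFullInfo(err))  // => { path: '/nonexistent' }
})
```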
@@ -268,11 +268,6 @@ describe('utils', function () {
expect(OError.getFullInfo(null)).to.deep.equal({})
})
it('works when given a string', function () {
const err = 'not an error instance'
expect(OError.getFullInfo(err)).to.deep.equal({})
})
it('works on a normal error', function () {
const err = new Error('foo')
expect(OError.getFullInfo(err)).to.deep.equal({})

@@ -35,14 +35,6 @@ describe('OError', function () {
expect(err2.cause.message).to.equal('cause 2')
})
it('accepts non-Error causes', function () {
const err1 = new OError('foo', {}, 'not-an-error')
expect(err1.cause).to.equal('not-an-error')
const err2 = new OError('foo').withCause('not-an-error')
expect(err2.cause).to.equal('not-an-error')
})
it('handles a custom error type with a cause', function () {
function doSomethingBadInternally() {
throw new Error('internal error')

@@ -0,0 +1 @@
node_modules/

libraries/object-persistor/.gitignore (new file)

@@ -0,0 +1,4 @@
/node_modules
*.swp
.npmrc

@@ -1 +1 @@
22.17.0
20.18.2

@@ -1,10 +1,10 @@
object-persistor
--dependencies=None
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
--docker-repos=gcr.io/overleaf-ops
--env-add=
--env-pass-through=
--esmock-loader=False
--is-library=True
--node-version=22.17.0
--node-version=20.18.2
--public-repo=False
--script-version=4.7.0
--script-version=4.5.0

@@ -305,10 +305,8 @@ module.exports = class FSPersistor extends AbstractPersistor {
async _listDirectory(path) {
if (this.useSubdirectories) {
// eslint-disable-next-line @typescript-eslint/return-await
return await glob(Path.join(path, '**'))
} else {
// eslint-disable-next-line @typescript-eslint/return-await
return await glob(`${path}_*`)
}
}

@@ -33,10 +33,6 @@ const AES256_KEY_LENGTH = 32
* @property {() => Promise<Array<RootKeyEncryptionKey>>} getRootKeyEncryptionKeys
*/
/**
* @typedef {import('./types').ListDirectoryResult} ListDirectoryResult
*/
/**
* Helper function to make TS happy when accessing error properties
* AWSError is not an actual class, so we cannot use instanceof.
@@ -395,9 +391,9 @@ class PerProjectEncryptedS3Persistor extends S3Persistor {
* A general "cache" for project keys is another alternative. For now, use a helper class.
*/
class CachedPerProjectEncryptedS3Persistor {
/** @type SSECOptions */
#projectKeyOptions
/** @type PerProjectEncryptedS3Persistor */
#parent
/**
@@ -428,16 +424,6 @@ class CachedPerProjectEncryptedS3Persistor {
return await this.#parent.getObjectSize(bucketName, path)
}
/**
*
* @param {string} bucketName
* @param {string} path
* @return {Promise<ListDirectoryResult>}
*/
async listDirectory(bucketName, path) {
return await this.#parent.listDirectory(bucketName, path)
}
/**
* @param {string} bucketName
* @param {string} path

@@ -20,18 +20,6 @@ const { URL } = require('node:url')
const { WriteError, ReadError, NotFoundError } = require('./Errors')
const zlib = require('node:zlib')
/**
* @typedef {import('aws-sdk/clients/s3').ListObjectsV2Output} ListObjectsV2Output
*/
/**
* @typedef {import('aws-sdk/clients/s3').Object} S3Object
*/
/**
* @typedef {import('./types').ListDirectoryResult} ListDirectoryResult
*/
/**
* Wrapper with private fields to avoid revealing them on console, JSON.stringify or similar.
*/
@@ -278,12 +266,26 @@ class S3Persistor extends AbstractPersistor {
* @return {Promise<void>}
*/
async deleteDirectory(bucketName, key, continuationToken) {
const { contents, response } = await this.listDirectory(
bucketName,
key,
continuationToken
)
const objects = contents.map(item => ({ Key: item.Key || '' }))
let response
const options = { Bucket: bucketName, Prefix: key }
if (continuationToken) {
options.ContinuationToken = continuationToken
}
try {
response = await this._getClientForBucket(bucketName)
.listObjectsV2(options)
.promise()
} catch (err) {
throw PersistorHelper.wrapError(
err,
'failed to list objects in S3',
{ bucketName, key },
ReadError
)
}
const objects = response.Contents?.map(item => ({ Key: item.Key || '' }))
if (objects?.length) {
try {
await this._getClientForBucket(bucketName)
@@ -314,36 +316,6 @@
}
}
/**
*
* @param {string} bucketName
* @param {string} key
* @param {string} [continuationToken]
* @return {Promise<ListDirectoryResult>}
*/
async listDirectory(bucketName, key, continuationToken) {
let response
const options = { Bucket: bucketName, Prefix: key }
if (continuationToken) {
options.ContinuationToken = continuationToken
}
try {
response = await this._getClientForBucket(bucketName)
.listObjectsV2(options)
.promise()
} catch (err) {
throw PersistorHelper.wrapError(
err,
'failed to list objects in S3',
{ bucketName, key },
ReadError
)
}
return { contents: response.Contents ?? [], response }
}
/**
* @param {string} bucketName
* @param {string} key

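The refactor in the hunks above: ext-ce pulls the `listObjectsV2` call out of `deleteDirectory` into a public `listDirectory(bucketName, key, continuationToken)` that returns `{ contents, response }`, and exposes it on the cached per-project persistor too; v5.4.1 keeps the listing inline. A hedged sketch of draining a prefix with the extracted method (`persistor` is an `S3Persistor` instance; the pagination field is standard `ListObjectsV2` behaviour):

```js
// Collect every key under a prefix, following S3's continuation tokens.
async function listAllKeys(persistor, bucket, prefix) {
  const keys = []
  let token
  do {
    // listDirectory wraps listObjectsV2 and returns { contents, response }
    const { contents, response } = await persistor.listDirectory(
      bucket,
      prefix,
      token
    )
    keys.push(...contents.map(object => object.Key))
    token = response.NextContinuationToken // undefined once the listing ends
  } while (token)
  return keys
}
```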
@@ -1,6 +0,0 @@
import type { ListObjectsV2Output, Object } from 'aws-sdk/clients/s3'
export type ListDirectoryResult = {
contents: Array<Object>
response: ListObjectsV2Output
}

@@ -0,0 +1 @@
node_modules/

@@ -0,0 +1,5 @@
/coverage
/node_modules
# managed by monorepo$ bin/update_build_scripts
.npmrc

@@ -1 +1 @@
22.17.0
20.18.2

@@ -1,10 +1,10 @@
overleaf-editor-core
--dependencies=None
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
--docker-repos=gcr.io/overleaf-ops
--env-add=
--env-pass-through=
--esmock-loader=False
--is-library=True
--node-version=22.17.0
--node-version=20.18.2
--public-repo=False
--script-version=4.7.0
--script-version=4.5.0

@@ -18,7 +18,6 @@ const MoveFileOperation = require('./lib/operation/move_file_operation')
const SetCommentStateOperation = require('./lib/operation/set_comment_state_operation')
const EditFileOperation = require('./lib/operation/edit_file_operation')
const EditNoOperation = require('./lib/operation/edit_no_operation')
const EditOperationTransformer = require('./lib/operation/edit_operation_transformer')
const SetFileMetadataOperation = require('./lib/operation/set_file_metadata_operation')
const NoOperation = require('./lib/operation/no_operation')
const Operation = require('./lib/operation')
@@ -44,8 +43,6 @@ const TrackingProps = require('./lib/file_data/tracking_props')
const Range = require('./lib/range')
const CommentList = require('./lib/file_data/comment_list')
const LazyStringFileData = require('./lib/file_data/lazy_string_file_data')
const StringFileData = require('./lib/file_data/string_file_data')
const EditOperationBuilder = require('./lib/operation/edit_operation_builder')
exports.AddCommentOperation = AddCommentOperation
exports.Author = Author
@@ -61,7 +58,6 @@ exports.DeleteCommentOperation = DeleteCommentOperation
exports.File = File
exports.FileMap = FileMap
exports.LazyStringFileData = LazyStringFileData
exports.StringFileData = StringFileData
exports.History = History
exports.Label = Label
exports.AddFileOperation = AddFileOperation
@@ -69,8 +65,6 @@ exports.MoveFileOperation = MoveFileOperation
exports.SetCommentStateOperation = SetCommentStateOperation
exports.EditFileOperation = EditFileOperation
exports.EditNoOperation = EditNoOperation
exports.EditOperationBuilder = EditOperationBuilder
exports.EditOperationTransformer = EditOperationTransformer
exports.SetFileMetadataOperation = SetFileMetadataOperation
exports.NoOperation = NoOperation
exports.Operation = Operation

@@ -13,7 +13,7 @@ const V2DocVersions = require('./v2_doc_versions')
/**
* @import Author from "./author"
* @import { BlobStore, RawChange, ReadonlyBlobStore } from "./types"
* @import { BlobStore } from "./types"
*/
/**
@@ -54,7 +54,7 @@ class Change {
/**
* For serialization.
*
* @return {RawChange}
* @return {Object}
*/
toRaw() {
function toRaw(object) {
@@ -219,7 +219,7 @@
* If this Change contains any File objects, load them.
*
* @param {string} kind see {File#load}
* @param {ReadonlyBlobStore} blobStore
* @param {BlobStore} blobStore
* @return {Promise<void>}
*/
async loadFiles(kind, blobStore) {

@@ -1,7 +1,7 @@
// @ts-check
/**
* @import { ClearTrackingPropsRawData, TrackingDirective } from '../types'
* @import { ClearTrackingPropsRawData } from '../types'
*/
class ClearTrackingProps {
@@ -11,27 +11,12 @@ class ClearTrackingProps {
/**
* @param {any} other
* @returns {other is ClearTrackingProps}
* @returns {boolean}
*/
equals(other) {
return other instanceof ClearTrackingProps
}
/**
* @param {TrackingDirective} other
* @returns {other is ClearTrackingProps}
*/
canMergeWith(other) {
return other instanceof ClearTrackingProps
}
/**
* @param {TrackingDirective} other
*/
mergeWith(other) {
return this
}
/**
* @returns {ClearTrackingPropsRawData}
*/

@@ -11,7 +11,7 @@ const EditOperation = require('../operation/edit_operation')
const EditOperationBuilder = require('../operation/edit_operation_builder')
/**
* @import { BlobStore, ReadonlyBlobStore, RangesBlob, RawHashFileData, RawLazyStringFileData } from '../types'
* @import { BlobStore, ReadonlyBlobStore, RangesBlob, RawFileData, RawLazyStringFileData } from '../types'
*/
class LazyStringFileData extends FileData {
@@ -159,11 +159,11 @@ class LazyStringFileData extends FileData {
/** @inheritdoc
* @param {BlobStore} blobStore
* @return {Promise<RawHashFileData>}
* @return {Promise<RawFileData>}
*/
async store(blobStore) {
if (this.operations.length === 0) {
/** @type RawHashFileData */
/** @type RawFileData */
const raw = { hash: this.hash }
if (this.rangesHash) {
raw.rangesHash = this.rangesHash
@@ -171,11 +171,9 @@
return raw
}
const eager = await this.toEager(blobStore)
const raw = await eager.store(blobStore)
this.hash = raw.hash
this.rangesHash = raw.rangesHash
this.operations.length = 0
return raw
/** @type RawFileData */
return await eager.store(blobStore)
}
}

@@ -8,7 +8,7 @@ const CommentList = require('./comment_list')
const TrackedChangeList = require('./tracked_change_list')
/**
* @import { StringFileRawData, RawHashFileData, BlobStore, CommentRawData } from "../types"
* @import { StringFileRawData, RawFileData, BlobStore, CommentRawData } from "../types"
* @import { TrackedChangeRawData, RangesBlob } from "../types"
* @import EditOperation from "../operation/edit_operation"
*/
@@ -88,14 +88,6 @@ class StringFileData extends FileData {
return content
}
/**
* Return docstore view of a doc: each line separated
* @return {string[]}
*/
getLines() {
return this.getContent({ filterTrackedDeletes: true }).split('\n')
}
/** @inheritdoc */
getByteLength() {
return Buffer.byteLength(this.content)
@@ -139,7 +131,7 @@
/**
* @inheritdoc
* @param {BlobStore} blobStore
* @return {Promise<RawHashFileData>}
* @return {Promise<RawFileData>}
*/
async store(blobStore) {
const blob = await blobStore.putString(this.content)

@@ -84,21 +84,6 @@ class TrackedChange {
)
)
}
/**
* Return an equivalent tracked change whose extent is limited to the given
* range
*
* @param {Range} range
* @returns {TrackedChange | null} - the result or null if the intersection is empty
*/
intersectRange(range) {
const intersection = this.range.intersect(range)
if (intersection == null) {
return null
}
return new TrackedChange(intersection, this.tracking)
}
}
module.exports = TrackedChange

@@ -2,11 +2,9 @@
const Range = require('../range')
const TrackedChange = require('./tracked_change')
const TrackingProps = require('../file_data/tracking_props')
const { InsertOp, RemoveOp, RetainOp } = require('../operation/scan_op')
/**
* @import { TrackingDirective, TrackedChangeRawData } from "../types"
* @import TextOperation from "../operation/text_operation"
*/
class TrackedChangeList {
@@ -60,22 +58,6 @@
return this._trackedChanges.filter(change => range.contains(change.range))
}
/**
* Returns tracked changes that overlap with the given range
* @param {Range} range
* @returns {TrackedChange[]}
*/
intersectRange(range) {
const changes = []
for (const change of this._trackedChanges) {
const intersection = change.intersectRange(range)
if (intersection != null) {
changes.push(intersection)
}
}
return changes
}
/**
* Returns the tracking props for a given range.
* @param {Range} range
@@ -107,8 +89,6 @@
/**
* Collapses consecutive (and compatible) ranges
*
* @private
* @returns {void}
*/
_mergeRanges() {
@@ -137,28 +117,12 @@
}
/**
* Apply an insert operation
*
* @param {number} cursor
* @param {string} insertedText
* @param {{tracking?: TrackingProps}} opts
*/
applyInsert(cursor, insertedText, opts = {}) {
this._applyInsert(cursor, insertedText, opts)
this._mergeRanges()
}
/**
* Apply an insert operation
*
* This method will not merge ranges at the end
*
* @private
* @param {number} cursor
* @param {string} insertedText
* @param {{tracking?: TrackingProps}} [opts]
*/
_applyInsert(cursor, insertedText, opts = {}) {
const newTrackedChanges = []
for (const trackedChange of this._trackedChanges) {
if (
@@ -207,29 +171,15 @@
newTrackedChanges.push(newTrackedChange)
}
this._trackedChanges = newTrackedChanges
this._mergeRanges()
}
/**
* Apply a delete operation to the list of tracked changes
*
* @param {number} cursor
* @param {number} length
*/
applyDelete(cursor, length) {
this._applyDelete(cursor, length)
this._mergeRanges()
}
/**
* Apply a delete operation to the list of tracked changes
*
* This method will not merge ranges at the end
*
* @private
* @param {number} cursor
* @param {number} length
*/
_applyDelete(cursor, length) {
const newTrackedChanges = []
for (const trackedChange of this._trackedChanges) {
const deletedRange = new Range(cursor, length)
@@ -255,31 +205,15 @@
}
}
this._trackedChanges = newTrackedChanges
}
/**
* Apply a retain operation to the list of tracked changes
*
* @param {number} cursor
* @param {number} length
* @param {{tracking?: TrackingDirective}} [opts]
*/
applyRetain(cursor, length, opts = {}) {
this._applyRetain(cursor, length, opts)
this._mergeRanges()
}
/**
* Apply a retain operation to the list of tracked changes
*
* This method will not merge ranges at the end
*
* @private
* @param {number} cursor
* @param {number} length
* @param {{tracking?: TrackingDirective}} opts
*/
_applyRetain(cursor, length, opts = {}) {
applyRetain(cursor, length, opts = {}) {
// If there's no tracking info, leave everything as-is
if (!opts.tracking) {
return
@@ -335,31 +269,6 @@
newTrackedChanges.push(newTrackedChange)
}
this._trackedChanges = newTrackedChanges
}
/**
* Apply a text operation to the list of tracked changes
*
* Ranges are merged only once at the end, for performance and to avoid
* problematic edge cases where intermediate ranges get incorrectly merged.
*
* @param {TextOperation} operation
*/
applyTextOperation(operation) {
// this cursor tracks the destination document that gets modified as
// operations are applied to it.
let cursor = 0
for (const op of operation.ops) {
if (op instanceof InsertOp) {
this._applyInsert(cursor, op.insertion, { tracking: op.tracking })
cursor += op.insertion.length
} else if (op instanceof RemoveOp) {
this._applyDelete(cursor, op.length)
} else if (op instanceof RetainOp) {
this._applyRetain(cursor, op.length, { tracking: op.tracking })
cursor += op.length
}
}
this._mergeRanges()
}
}

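The removed `applyTextOperation` is ext-ce's single entry point, replacing the per-op calls v5.4.1 makes from `TextOperation.apply` (visible in the text_operation hunk below): it walks the operation with a destination-document cursor, applies each scan op without merging, and merges ranges exactly once at the end to avoid problematic intermediate merges. Restated as a standalone function over the same classes, for readability:

```js
// Equivalent standalone form of the removed method; InsertOp, RemoveOp and
// RetainOp are the scan-op classes required at the top of this file.
function applyTextOperation(trackedChanges, operation) {
  let cursor = 0 // position in the destination document being built
  for (const op of operation.ops) {
    if (op instanceof InsertOp) {
      trackedChanges._applyInsert(cursor, op.insertion, { tracking: op.tracking })
      cursor += op.insertion.length
    } else if (op instanceof RemoveOp) {
      trackedChanges._applyDelete(cursor, op.length) // deletions do not advance
    } else if (op instanceof RetainOp) {
      trackedChanges._applyRetain(cursor, op.length, { tracking: op.tracking })
      cursor += op.length
    }
  }
  trackedChanges._mergeRanges() // merge once, after the whole operation
}
```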
@@ -62,35 +62,6 @@ class TrackingProps {
this.ts.getTime() === other.ts.getTime()
)
}
/**
* Are these tracking props compatible with the other tracking props for merging
* ranges?
*
* @param {TrackingDirective} other
* @returns {other is TrackingProps}
*/
canMergeWith(other) {
if (!(other instanceof TrackingProps)) {
return false
}
return this.type === other.type && this.userId === other.userId
}
/**
* Merge two tracking props
*
* Assumes that `canMerge(other)` returns true
*
* @param {TrackingDirective} other
*/
mergeWith(other) {
if (!this.canMergeWith(other)) {
throw new Error('Cannot merge with incompatible tracking props')
}
const ts = this.ts <= other.ts ? this.ts : other.ts
return new TrackingProps(this.type, this.userId, ts)
}
}
module.exports = TrackingProps

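The removed `canMergeWith`/`mergeWith` pair defines ext-ce's merge rule for adjacent tracked ranges: matching `type` and `userId` are enough, and the merged range keeps the earlier timestamp; v5.4.1's `equals`-based check (see the scan_op hunks below) also demanded identical timestamps. A worked example, with the constructor signature taken from the test hunks further down:

```js
const a = new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00Z'))
const b = new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:01Z'))

console.log(a.canMergeWith(b)) // true: type and user match, ts may differ
const merged = a.mergeWith(b)
console.log(merged.ts.toISOString()) // '2024-01-01T00:00:00.000Z' (earlier ts wins)

const c = new TrackingProps('insert', 'user2', new Date('2024-01-01T00:00:00Z'))
console.log(a.canMergeWith(c)) // false: different user
```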
@@ -7,7 +7,7 @@ const Change = require('./change')
const Snapshot = require('./snapshot')
/**
* @import { BlobStore, ReadonlyBlobStore } from "./types"
* @import { BlobStore } from "./types"
*/
class History {
@@ -85,7 +85,7 @@
* If this History contains any File objects, load them.
*
* @param {string} kind see {File#load}
* @param {ReadonlyBlobStore} blobStore
* @param {BlobStore} blobStore
* @return {Promise<void>}
*/
async loadFiles(kind, blobStore) {

@@ -36,20 +36,6 @@ class EditOperationBuilder {
}
throw new Error('Unsupported operation in EditOperationBuilder.fromJSON')
}
/**
* @param {unknown} raw
* @return {raw is RawEditOperation}
*/
static isValid(raw) {
return (
isTextOperation(raw) ||
isRawAddCommentOperation(raw) ||
isRawDeleteCommentOperation(raw) ||
isRawSetCommentStateOperation(raw) ||
isRawEditNoOperation(raw)
)
}
}
/**

@@ -13,7 +13,7 @@ let EditFileOperation = null
let SetFileMetadataOperation = null
/**
* @import { ReadonlyBlobStore } from "../types"
* @import { BlobStore } from "../types"
* @import Snapshot from "../snapshot"
*/
@@ -80,7 +80,7 @@ class Operation {
* If this operation references any files, load the files.
*
* @param {string} kind see {File#load}
* @param {ReadOnlyBlobStore} blobStore
* @param {BlobStore} blobStore
* @return {Promise<void>}
*/
async loadFiles(kind, blobStore) {}

@@ -175,7 +175,7 @@ class InsertOp extends ScanOp {
return false
}
if (this.tracking) {
if (!other.tracking || !this.tracking.canMergeWith(other.tracking)) {
if (!this.tracking.equals(other.tracking)) {
return false
}
} else if (other.tracking) {
@@ -198,10 +198,7 @@
throw new Error('Cannot merge with incompatible operation')
}
this.insertion += other.insertion
if (this.tracking != null && other.tracking != null) {
this.tracking = this.tracking.mergeWith(other.tracking)
}
// We already have the same commentIds
// We already have the same tracking info and commentIds
}
/**
@@ -309,13 +306,9 @@ class RetainOp extends ScanOp {
return false
}
if (this.tracking) {
if (!other.tracking || !this.tracking.canMergeWith(other.tracking)) {
return false
}
} else if (other.tracking) {
return false
return this.tracking.equals(other.tracking)
}
return true
return !other.tracking
}
/**
@@ -326,9 +319,6 @@
throw new Error('Cannot merge with incompatible operation')
}
this.length += other.length
if (this.tracking != null && other.tracking != null) {
this.tracking = this.tracking.mergeWith(other.tracking)
}
}
/**

@@ -314,18 +314,25 @@ class TextOperation extends EditOperation {
str
)
}
file.trackedChanges.applyRetain(result.length, op.length, {
tracking: op.tracking,
})
result += str.slice(inputCursor, inputCursor + op.length)
inputCursor += op.length
} else if (op instanceof InsertOp) {
if (containsNonBmpChars(op.insertion)) {
throw new InvalidInsertionError(str, op.toJSON())
}
file.trackedChanges.applyInsert(result.length, op.insertion, {
tracking: op.tracking,
})
file.comments.applyInsert(
new Range(result.length, op.insertion.length),
{ commentIds: op.commentIds }
)
result += op.insertion
} else if (op instanceof RemoveOp) {
file.trackedChanges.applyDelete(result.length, op.length)
file.comments.applyDelete(new Range(result.length, op.length))
inputCursor += op.length
} else {
@@ -345,8 +352,6 @@
throw new TextOperation.TooLongError(operation, result.length)
}
file.trackedChanges.applyTextOperation(this)
file.content = result
}
@@ -395,36 +400,44 @@
for (let i = 0, l = ops.length; i < l; i++) {
const op = ops[i]
if (op instanceof RetainOp) {
if (op.tracking) {
// Where we need to end up after the retains
const target = strIndex + op.length
// A previous retain could have overriden some tracking info. Now we
// need to restore it.
const previousChanges = previousState.trackedChanges.intersectRange(
new Range(strIndex, op.length)
)
// Where we need to end up after the retains
const target = strIndex + op.length
// A previous retain could have overriden some tracking info. Now we
// need to restore it.
const previousRanges = previousState.trackedChanges.inRange(
new Range(strIndex, op.length)
)
for (const change of previousChanges) {
if (strIndex < change.range.start) {
inverse.retain(change.range.start - strIndex, {
tracking: new ClearTrackingProps(),
})
strIndex = change.range.start
}
inverse.retain(change.range.length, {
tracking: change.tracking,
let removeTrackingInfoIfNeeded
if (op.tracking) {
removeTrackingInfoIfNeeded = new ClearTrackingProps()
}
for (const trackedChange of previousRanges) {
if (strIndex < trackedChange.range.start) {
inverse.retain(trackedChange.range.start - strIndex, {
tracking: removeTrackingInfoIfNeeded,
})
strIndex += change.range.length
strIndex = trackedChange.range.start
}
if (strIndex < target) {
inverse.retain(target - strIndex, {
tracking: new ClearTrackingProps(),
if (trackedChange.range.end < strIndex + op.length) {
inverse.retain(trackedChange.range.length, {
tracking: trackedChange.tracking,
})
strIndex = target
strIndex = trackedChange.range.end
}
} else {
inverse.retain(op.length)
strIndex += op.length
if (trackedChange.range.end !== strIndex) {
// No need to split the range at the end
const [left] = trackedChange.range.splitAt(strIndex)
inverse.retain(left.length, { tracking: trackedChange.tracking })
strIndex = left.end
}
}
if (strIndex < target) {
inverse.retain(target - strIndex, {
tracking: removeTrackingInfoIfNeeded,
})
strIndex = target
}
} else if (op instanceof InsertOp) {
inverse.remove(op.insertion.length)

@@ -86,32 +86,10 @@ class Range {
}
/**
* Does this range overlap another range?
*
* Overlapping means that the two ranges have at least one character in common
*
* @param {Range} other - the other range
* @param {Range} range
*/
overlaps(other) {
return this.start < other.end && this.end > other.start
}
/**
* Does this range overlap the start of another range?
*
* @param {Range} other - the other range
*/
overlapsStart(other) {
return this.start <= other.start && this.end > other.start
}
/**
* Does this range overlap the end of another range?
*
* @param {Range} other - the other range
*/
overlapsEnd(other) {
return this.start < other.end && this.end >= other.end
overlaps(range) {
return this.start < range.end && this.end > range.start
}
/**
@@ -249,26 +227,6 @@
)
return [rangeUpToCursor, rangeAfterCursor]
}
/**
* Returns the intersection of this range with another range
*
* @param {Range} other - the other range
* @return {Range | null} the intersection or null if the intersection is empty
*/
intersect(other) {
if (this.contains(other)) {
return other
} else if (other.contains(this)) {
return this
} else if (other.overlapsStart(this)) {
return new Range(this.pos, other.end - this.start)
} else if (other.overlapsEnd(this)) {
return new Range(other.pos, this.end - other.start)
} else {
return null
}
}
}
module.exports = Range

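`intersect`, removed here along with its `overlapsStart`/`overlapsEnd` helpers, returns the overlap of two ranges or `null` when they are disjoint; the removed tests at the bottom of this section pin down the semantics. A worked example matching those tests (the constructor is `new Range(pos, length)`):

```js
const r1 = new Range(5, 10) // covers positions 5..14
const r2 = new Range(3, 6)  // covers positions 3..8

console.log(r1.intersect(r2)) // Range with pos 5, length 4 (positions 5..8)
console.log(r1.intersect(new Range(7, 2)))   // nested: pos 7, length 2
console.log(r1.intersect(new Range(20, 30))) // disjoint: null
```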
@@ -193,13 +193,4 @@ describe('LazyStringFileData', function () {
expect(fileData.getStringLength()).to.equal(longString.length)
expect(fileData.getOperations()).to.have.length(1)
})
it('truncates its operations after being stored', async function () {
const testHash = File.EMPTY_FILE_HASH
const fileData = new LazyStringFileData(testHash, undefined, 0)
fileData.edit(new TextOperation().insert('abc'))
const stored = await fileData.store(this.blobStore)
expect(fileData.hash).to.equal(stored.hash)
expect(fileData.operations).to.deep.equal([])
})
})

@@ -1,3 +1,4 @@
// @ts-check
'use strict'
const { expect } = require('chai')
@@ -448,44 +449,4 @@ describe('Range', function () {
expect(() => range.insertAt(16, 3)).to.throw()
})
})
describe('intersect', function () {
it('should handle partially overlapping ranges', function () {
const range1 = new Range(5, 10)
const range2 = new Range(3, 6)
const intersection1 = range1.intersect(range2)
expect(intersection1.pos).to.equal(5)
expect(intersection1.length).to.equal(4)
const intersection2 = range2.intersect(range1)
expect(intersection2.pos).to.equal(5)
expect(intersection2.length).to.equal(4)
})
it('should intersect with itself', function () {
const range = new Range(5, 10)
const intersection = range.intersect(range)
expect(intersection.pos).to.equal(5)
expect(intersection.length).to.equal(10)
})
it('should handle nested ranges', function () {
const range1 = new Range(5, 10)
const range2 = new Range(7, 2)
const intersection1 = range1.intersect(range2)
expect(intersection1.pos).to.equal(7)
expect(intersection1.length).to.equal(2)
const intersection2 = range2.intersect(range1)
expect(intersection2.pos).to.equal(7)
expect(intersection2.length).to.equal(2)
})
it('should handle disconnected ranges', function () {
const range1 = new Range(5, 10)
const range2 = new Range(20, 30)
const intersection1 = range1.intersect(range2)
expect(intersection1).to.be.null
const intersection2 = range2.intersect(range1)
expect(intersection2).to.be.null
})
})
})

@@ -107,7 +107,7 @@ describe('RetainOp', function () {
expect(op1.equals(new RetainOp(3))).to.be.true
})
it('cannot merge with another RetainOp if the tracking user is different', function () {
it('cannot merge with another RetainOp if tracking info is different', function () {
const op1 = new RetainOp(
4,
new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
@@ -120,14 +120,14 @@
expect(() => op1.mergeWith(op2)).to.throw(Error)
})
it('can merge with another RetainOp if the tracking user is the same', function () {
it('can merge with another RetainOp if tracking info is the same', function () {
const op1 = new RetainOp(
4,
new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
)
const op2 = new RetainOp(
4,
new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:01.000Z'))
new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
)
op1.mergeWith(op2)
expect(
@@ -310,7 +310,7 @@ describe('InsertOp', function () {
expect(() => op1.mergeWith(op2)).to.throw(Error)
})
it('cannot merge with another InsertOp if tracking user is different', function () {
it('cannot merge with another InsertOp if tracking info is different', function () {
const op1 = new InsertOp(
'a',
new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
@@ -323,7 +323,7 @@
expect(() => op1.mergeWith(op2)).to.throw(Error)
})
it('can merge with another InsertOp if tracking user and comment info is the same', function () {
it('can merge with another InsertOp if tracking and comment info is the same', function () {
const op1 = new InsertOp(
'a',
new TrackingProps(
@@ -338,7 +338,7 @@
new TrackingProps(
'insert',
'user1',
new Date('2024-01-01T00:00:01.000Z')
new Date('2024-01-01T00:00:00.000Z')
),
['1', '2']
)

@@ -322,47 +322,6 @@ describe('TextOperation', function () {
new TextOperation().retain(4).remove(4).retain(3)
)
})
it('undoing a tracked delete restores the tracked changes', function () {
expectInverseToLeadToInitialState(
new StringFileData(
'the quick brown fox jumps over the lazy dog',
undefined,
[
{
range: { pos: 5, length: 5 },
tracking: {
ts: '2023-01-01T00:00:00.000Z',
type: 'insert',
userId: 'user1',
},
},
{
range: { pos: 12, length: 3 },
tracking: {
ts: '2023-01-01T00:00:00.000Z',
type: 'delete',
userId: 'user1',
},
},
{
range: { pos: 18, length: 5 },
tracking: {
ts: '2023-01-01T00:00:00.000Z',
type: 'insert',
userId: 'user1',
},
},
]
),
new TextOperation()
.retain(7)
.retain(13, {
tracking: new TrackingProps('delete', 'user1', new Date()),
})
.retain(23)
)
})
})
describe('compose', function () {

@@ -0,0 +1 @@
node_modules/

libraries/promise-utils/.gitignore (new file)

@@ -0,0 +1,3 @@
# managed by monorepo$ bin/update_build_scripts
.npmrc

@@ -1 +1 @@
22.17.0
20.18.2

@@ -1,10 +1,10 @@
promise-utils
--dependencies=None
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
--docker-repos=gcr.io/overleaf-ops
--env-add=
--env-pass-through=
--esmock-loader=False
--is-library=True
--node-version=22.17.0
--node-version=20.18.2
--public-repo=False
--script-version=4.7.0
--script-version=4.5.0

@@ -0,0 +1 @@
node_modules/

libraries/ranges-tracker/.gitignore (new file)

@@ -0,0 +1,13 @@
**.swp
app.js
app/js/
test/unit/js/
public/build/
node_modules/
/public/js/chat.js
plato/
.npmrc

@@ -1 +1 @@
22.17.0
20.18.2

@@ -1,10 +1,10 @@
ranges-tracker
--dependencies=None
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
--docker-repos=gcr.io/overleaf-ops
--env-add=
--env-pass-through=
--esmock-loader=False
--is-library=True
--node-version=22.17.0
--node-version=20.18.2
--public-repo=False
--script-version=4.7.0
--script-version=4.5.0

@@ -0,0 +1 @@
node_modules/

libraries/redis-wrapper/.gitignore (new file, 13 lines)

@@ -0,0 +1,13 @@
+**.swp
+app.js
+app/js/
+test/unit/js/
+public/build/
+node_modules/
+/public/js/chat.js
+plato/
+.npmrc


@@ -1 +1 @@
-22.17.0
+20.18.2


@@ -97,8 +97,7 @@ module.exports = class RedisLocker {
   }

   /**
    * @param {string} id
-   * @param {function(Error, boolean, string): void} callback
+   * @param {Callback} callback
    */
   tryLock(id, callback) {
     if (callback == null) {
@@ -107,7 +106,7 @@
     const lockValue = this.randomLock()
     const key = this.getKey(id)
     const startTime = Date.now()
-    this.rclient.set(
+    return this.rclient.set(
       key,
       lockValue,
       'EX',
@@ -122,7 +121,7 @@
       const timeTaken = Date.now() - startTime
       if (timeTaken > MAX_REDIS_REQUEST_LENGTH) {
         // took too long, so try to free the lock
-        this.releaseLock(id, lockValue, function (err, result) {
+        return this.releaseLock(id, lockValue, function (err, result) {
           if (err != null) {
             return callback(err)
           } // error freeing lock
@@ -140,8 +139,7 @@
   }

   /**
    * @param {string} id
-   * @param {function(Error, string): void} callback
+   * @param {Callback} callback
    */
   getLock(id, callback) {
     if (callback == null) {
@@ -155,7 +153,7 @@
       return callback(e)
     }
-    this.tryLock(id, (error, gotLock, lockValue) => {
+    return this.tryLock(id, (error, gotLock, lockValue) => {
       if (error != null) {
         return callback(error)
       }
@@ -175,15 +173,14 @@
   }

   /**
    * @param {string} id
-   * @param {function(Error, boolean): void} callback
+   * @param {Callback} callback
    */
   checkLock(id, callback) {
     if (callback == null) {
       callback = function () {}
     }
     const key = this.getKey(id)
-    this.rclient.exists(key, (err, exists) => {
+    return this.rclient.exists(key, (err, exists) => {
       if (err != null) {
         return callback(err)
       }
@@ -199,26 +196,30 @@
   }

   /**
    * @param {string} id
    * @param {string} lockValue
-   * @param {function(Error, boolean): void} callback
+   * @param {Callback} callback
    */
   releaseLock(id, lockValue, callback) {
     const key = this.getKey(id)
-    this.rclient.eval(UNLOCK_SCRIPT, 1, key, lockValue, (err, result) => {
-      if (err != null) {
-        return callback(err)
-      } else if (result != null && result !== 1) {
-        // successful unlock should release exactly one key
-        logger.error(
-          { id, key, lockValue, redis_err: err, redis_result: result },
-          'unlocking error'
-        )
-        metrics.inc(this.metricsPrefix + '-unlock-error')
-        return callback(new Error('tried to release timed out lock'))
-      } else {
-        return callback(null, result)
-      }
-    })
+    return this.rclient.eval(
+      UNLOCK_SCRIPT,
+      1,
+      key,
+      lockValue,
+      (err, result) => {
+        if (err != null) {
+          return callback(err)
+        } else if (result != null && result !== 1) {
+          // successful unlock should release exactly one key
+          logger.error(
+            { id, key, lockValue, redis_err: err, redis_result: result },
+            'unlocking error'
+          )
+          metrics.inc(this.metricsPrefix + '-unlock-error')
+          return callback(new Error('tried to release timed out lock'))
+        } else {
+          return callback(null, result)
+        }
+      }
+    )
   }
 }
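As a usage sketch of the locking API above: the constructor options are not shown in this diff, and docId, doWork, and done are illustrative; the callback shapes follow the ext-ce JSDoc annotations.

const locker = new RedisLocker(/* options elided: rclient, key naming, metrics prefix, ... */)

locker.getLock(docId, (err, lockValue) => {
  if (err) return done(err) // could not acquire the lock
  doWork(workErr => {
    // UNLOCK_SCRIPT deletes the key only while it still holds lockValue,
    // so a lock that timed out and was re-acquired elsewhere is left alone.
    locker.releaseLock(docId, lockValue, releaseErr =>
      done(workErr || releaseErr)
    )
  })
})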


@@ -1,10 +1,10 @@
 redis-wrapper
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.2
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0


@@ -0,0 +1 @@
+node_modules/

libraries/settings/.gitignore (new file, 5 lines)

@@ -0,0 +1,5 @@
+/.npmrc
+/node_modules
+
+# managed by monorepo$ bin/update_build_scripts
+.npmrc


@@ -1 +1 @@
-22.17.0
+20.18.2


@@ -1,10 +1,10 @@
 settings
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.2
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0


@@ -0,0 +1 @@
+node_modules/

libraries/stream-utils/.gitignore (new file, 3 lines)

@@ -0,0 +1,3 @@
+# managed by monorepo$ bin/update_build_scripts
+.npmrc


@@ -1 +1 @@
-22.17.0
+20.18.2


@@ -1,10 +1,10 @@
 stream-utils
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.2
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0


@@ -145,24 +145,6 @@ class LoggerStream extends Transform {
   }
 }

-class MeteredStream extends Transform {
-  #Metrics
-  #metric
-  #labels
-
-  constructor(Metrics, metric, labels) {
-    super()
-    this.#Metrics = Metrics
-    this.#metric = metric
-    this.#labels = labels
-  }
-
-  _transform(chunk, encoding, callback) {
-    this.#Metrics.count(this.#metric, chunk.byteLength, 1, this.#labels)
-    callback(null, chunk)
-  }
-}
-
 // Export our classes
 module.exports = {
@@ -171,7 +153,6 @@ module.exports = {
   LoggerStream,
   LimitedStream,
   TimeoutStream,
-  MeteredStream,
   SizeExceededError,
   AbortError,
 }
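A hedged sketch of how the removed MeteredStream was typically wired up. The constructor shape and the Metrics.count call come from the code above; the package name and the source/destination streams are assumptions.

const { MeteredStream } = require('@overleaf/stream-utils')
const Metrics = require('@overleaf/metrics')

source
  .pipe(new MeteredStream(Metrics, 'ingress_bytes', { route: 'upload' }))
  .pipe(destination) // chunks pass through unchanged; byte counts feed the metric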

package-lock.json (generated, 11281 changed lines): diff suppressed because it is too large.


@@ -8,8 +8,8 @@
     "@types/chai": "^4.3.0",
     "@types/chai-as-promised": "^7.1.8",
     "@types/mocha": "^10.0.6",
-    "@typescript-eslint/eslint-plugin": "^8.30.1",
-    "@typescript-eslint/parser": "^8.30.1",
+    "@typescript-eslint/eslint-plugin": "^8.0.0",
+    "@typescript-eslint/parser": "^8.0.0",
     "eslint": "^8.15.0",
     "eslint-config-prettier": "^8.5.0",
     "eslint-config-standard": "^17.0.0",
@@ -18,24 +18,28 @@
     "eslint-plugin-cypress": "^2.15.1",
     "eslint-plugin-import": "^2.26.0",
     "eslint-plugin-mocha": "^10.1.0",
-    "eslint-plugin-n": "^15.7.0",
+    "eslint-plugin-node": "^11.1.0",
     "eslint-plugin-prettier": "^4.0.0",
     "eslint-plugin-promise": "^6.0.0",
     "eslint-plugin-unicorn": "^56.0.0",
-    "prettier": "3.6.2",
-    "typescript": "^5.8.3"
-  },
-  "engines": {
-    "npm": "11.4.2"
+    "prettier": "3.3.3",
+    "typescript": "^5.5.4"
   },
   "overrides": {
-    "swagger-tools@0.10.4": {
-      "path-to-regexp": "3.3.0",
-      "body-parser": "1.20.3",
-      "multer": "2.0.1"
+    "cross-env": {
+      "cross-spawn": "^7.0.6"
     },
-    "request@2.88.2": {
-      "tough-cookie": "5.1.2"
+    "fetch-mock": {
+      "path-to-regexp": "3.3.0"
+    },
+    "google-gax": {
+      "protobufjs": "^7.2.5"
+    },
+    "swagger-tools": {
+      "body-parser": "1.20.3",
+      "multer": "1.4.5-lts.1",
+      "path-to-regexp": "3.3.0",
+      "qs": "6.13.0"
     }
   },
   "scripts": {
@@ -51,7 +55,6 @@
     "services/analytics",
     "services/chat",
     "services/clsi",
-    "services/clsi-cache",
     "services/clsi-perf",
     "services/contacts",
     "services/docstore",


@@ -1,23 +0,0 @@
diff --git a/node_modules/@node-saml/node-saml/lib/saml.js b/node_modules/@node-saml/node-saml/lib/saml.js
index fba15b9..a5778cb 100644
--- a/node_modules/@node-saml/node-saml/lib/saml.js
+++ b/node_modules/@node-saml/node-saml/lib/saml.js
@@ -336,7 +336,8 @@ class SAML {
const requestOrResponse = request || response;
(0, utility_1.assertRequired)(requestOrResponse, "either request or response is required");
let buffer;
- if (this.options.skipRequestCompression) {
+ // logout requestOrResponse must be compressed anyway
+ if (this.options.skipRequestCompression && operation !== "logout") {
buffer = Buffer.from(requestOrResponse, "utf8");
}
else {
@@ -495,7 +496,7 @@ class SAML {
try {
xml = Buffer.from(container.SAMLResponse, "base64").toString("utf8");
doc = await (0, xml_1.parseDomFromString)(xml);
- const inResponseToNodes = xml_1.xpath.selectAttributes(doc, "/*[local-name()='Response']/@InResponseTo");
+ const inResponseToNodes = xml_1.xpath.selectAttributes(doc, "/*[local-name()='Response' or local-name()='LogoutResponse']/@InResponseTo");
if (inResponseToNodes) {
inResponseTo = inResponseToNodes.length ? inResponseToNodes[0].nodeValue : null;
await this.validateInResponseTo(inResponseTo);
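For orientation, a hedged sketch of the setting this (now removed) patch interacts with. Only the SAML class and skipRequestCompression come from the patch itself; the remaining required options are elided.

const { SAML } = require('@node-saml/node-saml')

const saml = new SAML({
  // ...issuer, URLs, and certificates omitted...
  // With the patch, logout requests/responses are still deflate-compressed
  // even though compression is skipped for the other flows.
  skipRequestCompression: true,
})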


@@ -1,64 +0,0 @@
diff --git a/node_modules/ldapauth-fork/lib/ldapauth.js b/node_modules/ldapauth-fork/lib/ldapauth.js
index 85ecf36a8b..a7d07e0f78 100644
--- a/node_modules/ldapauth-fork/lib/ldapauth.js
+++ b/node_modules/ldapauth-fork/lib/ldapauth.js
@@ -69,6 +69,7 @@ function LdapAuth(opts) {
this.opts.bindProperty || (this.opts.bindProperty = 'dn');
this.opts.groupSearchScope || (this.opts.groupSearchScope = 'sub');
this.opts.groupDnProperty || (this.opts.groupDnProperty = 'dn');
+ this.opts.tlsStarted = false;
EventEmitter.call(this);
@@ -108,21 +109,7 @@ function LdapAuth(opts) {
this._userClient.on('error', this._handleError.bind(this));
var self = this;
- if (this.opts.starttls) {
- // When starttls is enabled, this callback supplants the 'connect' callback
- this._adminClient.starttls(this.opts.tlsOptions, this._adminClient.controls, function(err) {
- if (err) {
- self._handleError(err);
- } else {
- self._onConnectAdmin();
- }
- });
- this._userClient.starttls(this.opts.tlsOptions, this._userClient.controls, function(err) {
- if (err) {
- self._handleError(err);
- }
- });
- } else if (opts.reconnect) {
+ if (opts.reconnect && !this.opts.starttls) {
this.once('_installReconnectListener', function() {
self.log && self.log.trace('install reconnect listener');
self._adminClient.on('connect', function() {
@@ -384,6 +371,28 @@ LdapAuth.prototype._findGroups = function(user, callback) {
*/
LdapAuth.prototype.authenticate = function(username, password, callback) {
var self = this;
+ if (this.opts.starttls && !this.opts.tlsStarted) {
+ // When starttls is enabled, this callback supplants the 'connect' callback
+ this._adminClient.starttls(this.opts.tlsOptions, this._adminClient.controls, function (err) {
+ if (err) {
+ self._handleError(err);
+ } else {
+ self._onConnectAdmin(function(){self._handleAuthenticate(username, password, callback);});
+ }
+ });
+ this._userClient.starttls(this.opts.tlsOptions, this._userClient.controls, function (err) {
+ if (err) {
+ self._handleError(err);
+ }
+ });
+ } else {
+ self._handleAuthenticate(username, password, callback);
+ }
+};
+
+LdapAuth.prototype._handleAuthenticate = function (username, password, callback) {
+ this.opts.tlsStarted = true;
+ var self = this;
if (typeof password === 'undefined' || password === null || password === '') {
return callback(new Error('no password given'));
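A hedged sketch of the client setup this (now removed) patch affects. The option and method names appear in the patch itself; the values and surrounding wiring are illustrative.

const LdapAuth = require('ldapauth-fork')

const auth = new LdapAuth({
  url: 'ldap://ldap.example.com',
  starttls: true, // with the patch, STARTTLS is deferred until first use
  tlsOptions: { rejectUnauthorized: true },
  // ...bind and search options omitted...
})

// TLS negotiation now happens here, before the bind, instead of at construction time.
auth.authenticate(username, password, (err, user) => {
  /* ... */
})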


@@ -1,22 +0,0 @@
diff --git a/node_modules/pdfjs-dist/build/pdf.worker.mjs b/node_modules/pdfjs-dist/build/pdf.worker.mjs
index 6c5c6f1..bb6b7d1 100644
--- a/node_modules/pdfjs-dist/build/pdf.worker.mjs
+++ b/node_modules/pdfjs-dist/build/pdf.worker.mjs
@@ -1830,7 +1830,7 @@ async function __wbg_init(module_or_path) {
}
}
if (typeof module_or_path === 'undefined') {
- module_or_path = new URL('qcms_bg.wasm', import.meta.url);
+ module_or_path = new URL(/* webpackIgnore: true */ 'qcms_bg.wasm', import.meta.url);
}
const imports = __wbg_get_imports();
if (typeof module_or_path === 'string' || typeof Request === 'function' && module_or_path instanceof Request || typeof URL === 'function' && module_or_path instanceof URL) {
@@ -5358,7 +5358,7 @@ var OpenJPEG = (() => {
if (Module["locateFile"]) {
return locateFile("openjpeg.wasm");
}
- return new URL("openjpeg.wasm", import.meta.url).href;
+ return new URL(/* webpackIgnore: true */ "openjpeg.wasm", import.meta.url).href;
}
function getBinarySync(file) {
if (file == wasmBinaryFile && wasmBinary) {
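The inline webpackIgnore comment used by this (now removed) patch is a standard webpack escape hatch: it stops the bundler from trying to resolve and bundle the wasm asset at build time, leaving the new URL(..., import.meta.url) lookup to happen in the browser at runtime. A minimal illustration:

// Without the comment, webpack rewrites the URL at build time and attempts
// to bundle the asset; with it, resolution is deferred to the runtime.
const wasmUrl = new URL(/* webpackIgnore: true */ 'qcms_bg.wasm', import.meta.url)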


@@ -115,3 +115,9 @@ ENV LOG_LEVEL="info"
 EXPOSE 80

 ENTRYPOINT ["/sbin/my_init"]
+
+# Store the revision
+# ------------------
+# This should be the last step to optimize docker image caching.
+ARG MONOREPO_REVISION
+RUN echo "monorepo-server-ce,$MONOREPO_REVISION" > /var/www/revisions.txt

@@ -2,7 +2,7 @@
 # Overleaf Base Image (sharelatex/sharelatex-base)
 # --------------------------------------------------
-FROM phusion/baseimage:noble-1.0.2
+FROM phusion/baseimage:noble-1.0.0

 # Makes sure LuaTex cache is writable
 # -----------------------------------
@@ -10,7 +10,7 @@ ENV TEXMFVAR=/var/lib/overleaf/tmp/texmf-var

 # Update to ensure dependencies are updated
 # ------------------------------------------
-ENV REBUILT_AFTER="2025-05-19"
+ENV REBUILT_AFTER="2025-03-27"

 # Install dependencies
 # --------------------
@@ -30,7 +30,7 @@ RUN --mount=type=cache,target=/var/cache/apt,sharing=locked \
   # install Node.js https://github.com/nodesource/distributions#nodejs
   && mkdir -p /etc/apt/keyrings \
   && curl -fsSL https://deb.nodesource.com/gpgkey/nodesource-repo.gpg.key | gpg --dearmor -o /etc/apt/keyrings/nodesource.gpg \
-  && echo "deb [signed-by=/etc/apt/keyrings/nodesource.gpg] https://deb.nodesource.com/node_22.x nodistro main" | tee /etc/apt/sources.list.d/nodesource.list \
+  && echo "deb [signed-by=/etc/apt/keyrings/nodesource.gpg] https://deb.nodesource.com/node_20.x nodistro main" | tee /etc/apt/sources.list.d/nodesource.list \
   && apt-get update \
   && apt-get install -y nodejs \
   \

Some files were not shown because too many files have changed in this diff.