Compare commits


No commits in common. "ext-ce" and "v5.3.1" have entirely different histories.

3,101 changed files with 130,001 additions and 154,787 deletions


@@ -1,19 +1,10 @@
----
-name: Bug report
-about: Report a bug
-title: ''
-labels: type:bug
-assignees: ''
----
 <!--
 Note: If you are using www.overleaf.com and have a problem,
 or if you would like to request a new feature please contact
 the support team at support@overleaf.com

 This form should only be used to report bugs in the
 Community Edition release of Overleaf.
 -->


@@ -14,52 +14,39 @@
   <a href="#license">License</a>
 </p>

-<img src="doc/screenshot.png" alt="A screenshot of a project being edited in Overleaf Extended Community Edition">
+<img src="doc/screenshot.png" alt="A screenshot of a project being edited in Overleaf Community Edition">
 <p align="center">
-  Figure 1: A screenshot of a project being edited in Overleaf Extended Community Edition.
+  Figure 1: A screenshot of a project being edited in Overleaf Community Edition.
 </p>

 ## Community Edition

-[Overleaf](https://www.overleaf.com) is an open-source online real-time collaborative LaTeX editor. Overleaf runs a hosted version at [www.overleaf.com](https://www.overleaf.com), but you can also run your own local version, and contribute to the development of Overleaf.
-
-## Extended Community Edition
-
-The present "extended" version of Overleaf CE includes:
-
-- Template Gallery
-- Sandboxed Compiles with TeX Live image selection
-- LDAP authentication
-- SAML authentication
-- OpenID Connect authentication
-- Real-time track changes and comments
-- Autocomplete of reference keys
-- Symbol Palette
-- "From External URL" feature
-
-> [!CAUTION]
-> Overleaf Community Edition is intended for use in environments where **all** users are trusted. Community Edition is **not** appropriate for scenarios where isolation of users is required due to Sandboxed Compiles not being available. When not using Sandboxed Compiles, users have full read and write access to the `sharelatex` container resources (filesystem, network, environment variables) when running LaTeX compiles.
-
-Therefore, in any environment where not all users can be fully trusted, it is strongly recommended to enable the Sandboxed Compiles feature available in the Extended Community Edition.
-
-For more information on Sandboxed Compiles check out Overleaf [documentation](https://docs.overleaf.com/on-premises/configuration/overleaf-toolkit/server-pro-only-configuration/sandboxed-compiles).
+[Overleaf](https://www.overleaf.com) is an open-source online real-time collaborative LaTeX editor. We run a hosted version at [www.overleaf.com](https://www.overleaf.com), but you can also run your own local version, and contribute to the development of Overleaf.

 ## Enterprise

-If you want help installing and maintaining Overleaf in your lab or workplace, Overleaf offers an officially supported version called [Overleaf Server Pro](https://www.overleaf.com/for/enterprises).
+If you want help installing and maintaining Overleaf in your lab or workplace, we offer an officially supported version called [Overleaf Server Pro](https://www.overleaf.com/for/enterprises). It also includes more features for security (SSO with LDAP or SAML), administration and collaboration (e.g. tracked changes). [Find out more!](https://www.overleaf.com/for/enterprises)
+
+## Keeping up to date
+
+Sign up to the [mailing list](https://mailchi.mp/overleaf.com/community-edition-and-server-pro) to get updates on Overleaf releases and development.

 ## Installation

-Detailed installation instructions can be found in the [Overleaf Toolkit](https://github.com/overleaf/toolkit/).
-
-Configuration details and release history for the Extended Community Edition can be found on the [Extended CE Wiki Page](https://github.com/yu-i-i/overleaf-cep/wiki).
+We have detailed installation instructions in the [Overleaf Toolkit](https://github.com/overleaf/toolkit/).
+
+## Upgrading
+
+If you are upgrading from a previous version of Overleaf, please see the [Release Notes section on the Wiki](https://github.com/overleaf/overleaf/wiki#release-notes) for all of the versions between your current version and the version you are upgrading to.

 ## Overleaf Docker Image

 This repo contains two dockerfiles, [`Dockerfile-base`](server-ce/Dockerfile-base), which builds the
-`sharelatex/sharelatex-base:ext-ce` image, and [`Dockerfile`](server-ce/Dockerfile) which builds the
-`sharelatex/sharelatex:ext-ce` image.
+`sharelatex/sharelatex-base` image, and [`Dockerfile`](server-ce/Dockerfile) which builds the
+`sharelatex/sharelatex` (or "community") image.

 The Base image generally contains the basic dependencies like `wget`, plus `texlive`.
-This is split out because it's a pretty heavy set of
+We split this out because it's a pretty heavy set of
 dependencies, and it's nice to not have to rebuild all of that every time.

 The `sharelatex/sharelatex` image extends the base image and adds the actual Overleaf code
@@ -67,19 +54,23 @@ and services.

 Use `make build-base` and `make build-community` from `server-ce/` to build these images.

-The [Phusion base-image](https://github.com/phusion/baseimage-docker)
-(which is extended by the `base` image) provides a VM-like container
+We use the [Phusion base-image](https://github.com/phusion/baseimage-docker)
+(which is extended by our `base` image) to provide us with a VM-like container
 in which to run the Overleaf services. Baseimage uses the `runit` service
-manager to manage services, and init scripts from the `server-ce/runit`
-folder are added.
+manager to manage services, and we add our init-scripts from the `server-ce/runit`
+folder.
+
+## Contributing
+
+Please see the [CONTRIBUTING](CONTRIBUTING.md) file for information on contributing to the development of Overleaf.

 ## Authors

 [The Overleaf Team](https://www.overleaf.com/about)
-
-[yu-i-i](https://github.com/yu-i-i/overleaf-cep) — Extensions for CE unless otherwise noted

 ## License

 The code in this repository is released under the GNU AFFERO GENERAL PUBLIC LICENSE, version 3. A copy can be found in the [`LICENSE`](LICENSE) file.

-Copyright (c) Overleaf, 2014-2025.
+Copyright (c) Overleaf, 2014-2024.


@@ -11,6 +11,12 @@ bin/build
 > [!NOTE]
 > If Docker is running out of RAM while building the services in parallel, create a `.env` file in this directory containing `COMPOSE_PARALLEL_LIMIT=1`.

+Next, initialize the database:
+
+```shell
+bin/init
+```
+
 Then start the services:

 ```shell
@@ -42,7 +48,7 @@ To do this, use the included `bin/dev` script:
 bin/dev
 ```

-This will start all services using `node --watch`, which will automatically monitor the code and restart the services as necessary.
+This will start all services using `nodemon`, which will automatically monitor the code and restart the services as necessary.

 To improve performance, you can start only a subset of the services in development mode by providing a space-separated list to the `bin/dev` script:

@@ -77,7 +83,6 @@ each service:
 | `filestore`       | 9235 |
 | `notifications`   | 9236 |
 | `real-time`       | 9237 |
-| `references`      | 9238 |
 | `history-v1`      | 9239 |
 | `project-history` | 9240 |

develop/bin/init (new executable file, 6 lines)

@@ -0,0 +1,6 @@
+#!/usr/bin/env bash
+
+docker compose up --detach mongo
+curl --max-time 10 --retry 5 --retry-delay 5 --retry-all-errors --silent --output /dev/null localhost:27017
+docker compose exec mongo mongosh --eval "rs.initiate({ _id: 'overleaf', members: [{ _id: 0, host: 'mongo:27017' }] })"
+docker compose down mongo


@@ -6,18 +6,14 @@ DOCUMENT_UPDATER_HOST=document-updater
 FILESTORE_HOST=filestore
 GRACEFUL_SHUTDOWN_DELAY_SECONDS=0
 HISTORY_V1_HOST=history-v1
-HISTORY_REDIS_HOST=redis
 LISTEN_ADDRESS=0.0.0.0
 MONGO_HOST=mongo
 MONGO_URL=mongodb://mongo/sharelatex?directConnection=true
 NOTIFICATIONS_HOST=notifications
 PROJECT_HISTORY_HOST=project-history
-QUEUES_REDIS_HOST=redis
 REALTIME_HOST=real-time
 REDIS_HOST=redis
-REFERENCES_HOST=references
 SESSION_SECRET=foo
-V1_HISTORY_HOST=history-v1
 WEBPACK_HOST=webpack
 WEB_API_PASSWORD=overleaf
 WEB_API_USER=overleaf


@@ -112,19 +112,8 @@ services:
       - ../services/real-time/app.js:/overleaf/services/real-time/app.js
       - ../services/real-time/config:/overleaf/services/real-time/config

-  references:
-    command: ["node", "--watch", "app.js"]
-    environment:
-      - NODE_OPTIONS=--inspect=0.0.0.0:9229
-    ports:
-      - "127.0.0.1:9238:9229"
-    volumes:
-      - ../services/references/app:/overleaf/services/references/app
-      - ../services/references/config:/overleaf/services/references/config
-      - ../services/references/app.js:/overleaf/services/references/app.js
-
   web:
-    command: ["node", "--watch", "app.mjs", "--watch-locales"]
+    command: ["node", "--watch", "app.js", "--watch-locales"]
     environment:
       - NODE_OPTIONS=--inspect=0.0.0.0:9229
     ports:


@@ -1,5 +1,6 @@
 volumes:
   clsi-cache:
+  clsi-output:
   filestore-public-files:
   filestore-template-files:
   filestore-uploads:
@@ -25,16 +26,15 @@ services:
     env_file:
       - dev.env
     environment:
+      - DOCKER_RUNNER=true
       - TEXLIVE_IMAGE=texlive-full # docker build texlive -t texlive-full
-      - SANDBOXED_COMPILES=true
-      - SANDBOXED_COMPILES_HOST_DIR_COMPILES=${PWD}/compiles
-      - SANDBOXED_COMPILES_HOST_DIR_OUTPUT=${PWD}/output
+      - COMPILES_HOST_DIR=${PWD}/compiles
     user: root
     volumes:
       - ${PWD}/compiles:/overleaf/services/clsi/compiles
-      - ${PWD}/output:/overleaf/services/clsi/output
       - ${DOCKER_SOCKET_PATH:-/var/run/docker.sock}:/var/run/docker.sock
       - clsi-cache:/overleaf/services/clsi/cache
+      - clsi-output:/overleaf/services/clsi/output

   contacts:
     build:
@@ -88,20 +88,12 @@ services:
       - history-v1-buckets:/buckets

   mongo:
-    image: mongo:6.0
+    image: mongo:5
     command: --replSet overleaf
     ports:
       - "127.0.0.1:27017:27017" # for debugging
     volumes:
       - mongo-data:/data/db
-      - ../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
-    environment:
-      MONGO_INITDB_DATABASE: sharelatex
-    extra_hosts:
-      # Required when using the automatic database setup for initializing the
-      # replica set. This override is not needed when running the setup after
-      # starting up mongo.
-      - mongo:127.0.0.1

   notifications:
     build:
@@ -123,7 +115,7 @@ services:
       dockerfile: services/real-time/Dockerfile
     env_file:
       - dev.env

   redis:
     image: redis:5
     ports:
@@ -131,13 +123,6 @@ services:
     volumes:
       - redis-data:/data

-  references:
-    build:
-      context: ..
-      dockerfile: services/references/Dockerfile
-    env_file:
-      - dev.env
-
   web:
     build:
       context: ..
@@ -147,7 +132,7 @@ services:
       - dev.env
     environment:
       - APP_NAME=Overleaf Community Edition
-      - ENABLED_LINKED_FILE_TYPES=project_file,project_output_file,url
+      - ENABLED_LINKED_FILE_TYPES=project_file,project_output_file
       - EMAIL_CONFIRMATION_DISABLED=true
       - NODE_ENV=development
       - OVERLEAF_ALLOW_PUBLIC_ACCESS=true
@@ -168,7 +153,6 @@ services:
       - notifications
       - project-history
       - real-time
-      - references

   webpack:
     build:

(Binary image file changed; not shown. Before: 13 KiB. After: 71 KiB.)

(Binary image file changed; not shown. Before: 1 MiB. After: 587 KiB.)


@@ -32,7 +32,7 @@ services:
       OVERLEAF_REDIS_HOST: redis
       REDIS_HOST: redis

-      ENABLED_LINKED_FILE_TYPES: 'project_file,project_output_file,url'
+      ENABLED_LINKED_FILE_TYPES: 'project_file,project_output_file'

       # Enables Thumbnail generation using ImageMagick
       ENABLE_CONVERSIONS: 'true'
@@ -40,6 +40,10 @@ services:
       # Disables email confirmation requirement
       EMAIL_CONFIRMATION_DISABLED: 'true'

+      # temporary fix for LuaLaTex compiles
+      # see https://github.com/overleaf/overleaf/issues/695
+      TEXMFVAR: /var/lib/overleaf/tmp/texmf-var
+
       ## Set for SSL via nginx-proxy
       #VIRTUAL_HOST: 103.112.212.22
@@ -73,19 +77,11 @@ services:
       ## Server Pro ##
       ################

-      ## The Community Edition is intended for use in environments where all users are trusted and is not appropriate for
-      ## scenarios where isolation of users is required. Sandboxed Compiles are not available in the Community Edition,
-      ## so the following environment variables must be commented out to avoid compile issues.
-      ##
-      ## Sandboxed Compiles: https://docs.overleaf.com/on-premises/configuration/overleaf-toolkit/server-pro-only-configuration/sandboxed-compiles
+      ## Sandboxed Compiles: https://github.com/overleaf/overleaf/wiki/Server-Pro:-Sandboxed-Compiles
       SANDBOXED_COMPILES: 'true'
-      ### Bind-mount source for /var/lib/overleaf/data/compiles inside the container.
-      SANDBOXED_COMPILES_HOST_DIR_COMPILES: '/home/user/sharelatex_data/data/compiles'
-      ### Bind-mount source for /var/lib/overleaf/data/output inside the container.
-      SANDBOXED_COMPILES_HOST_DIR_OUTPUT: '/home/user/sharelatex_data/data/output'
-      ### Backwards compatibility (before Server Pro 5.5)
-      DOCKER_RUNNER: 'true'
       SANDBOXED_COMPILES_SIBLING_CONTAINERS: 'true'
+      ### Bind-mount source for /var/lib/overleaf/data/compiles inside the container.
+      SANDBOXED_COMPILES_HOST_DIR: '/home/user/sharelatex_data/data/compiles'

       ## Works with test LDAP server shown at bottom of docker compose
       # OVERLEAF_LDAP_URL: 'ldap://ldap:389'
@@ -106,12 +102,12 @@ services:
   mongo:
     restart: always
-    image: mongo:6.0
+    image: mongo:5.0
     container_name: mongo
     command: '--replSet overleaf'
     volumes:
       - ~/mongo_data:/data/db
-      - ./bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
+      - ./mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
     environment:
       MONGO_INITDB_DATABASE: sharelatex
     extra_hosts:
@@ -119,7 +115,7 @@ services:
       # This override is not needed when running the setup after starting up mongo.
       - mongo:127.0.0.1
     healthcheck:
-      test: echo 'db.stats().ok' | mongosh localhost:27017/test --quiet
+      test: echo 'db.stats().ok' | mongo localhost:27017/test --quiet
       interval: 10s
       timeout: 10s
       retries: 5


@@ -0,0 +1 @@
+node_modules/


@@ -0,0 +1,46 @@
+compileFolder
+
+Compiled source #
+###################
+*.com
+*.class
+*.dll
+*.exe
+*.o
+*.so
+
+# Packages #
+############
+# it's better to unpack these files and commit the raw source
+# git has its own built in compression methods
+*.7z
+*.dmg
+*.gz
+*.iso
+*.jar
+*.rar
+*.tar
+*.zip
+
+# Logs and databases #
+######################
+*.log
+*.sql
+*.sqlite
+
+# OS generated files #
+######################
+.DS_Store?
+ehthumbs.db
+Icon?
+Thumbs.db
+
+/node_modules/*
+data/*/*
+**.swp
+/log.json
+hash_folder
+.npmrc


@@ -1 +1 @@
-22.17.0
+20.18.0


@@ -1,10 +1,10 @@
 access-token-encryptor
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0


@@ -21,7 +21,7 @@
   "devDependencies": {
     "chai": "^4.3.6",
     "chai-as-promised": "^7.1.1",
-    "mocha": "^11.1.0",
+    "mocha": "^10.2.0",
     "sandboxed-module": "^2.0.4",
     "typescript": "^5.0.4"
   }


@@ -0,0 +1 @@
+node_modules/

libraries/fetch-utils/.gitignore (new file, 3 lines)

@@ -0,0 +1,3 @@
+# managed by monorepo$ bin/update_build_scripts
+.npmrc


@@ -1 +1 @@
-22.17.0
+20.18.0


@@ -1,10 +1,10 @@
 fetch-utils
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0


@@ -23,11 +23,11 @@ async function fetchJson(url, opts = {}) {
 }

 async function fetchJsonWithResponse(url, opts = {}) {
-  const { fetchOpts, detachSignal } = parseOpts(opts)
+  const { fetchOpts } = parseOpts(opts)
   fetchOpts.headers = fetchOpts.headers ?? {}
   fetchOpts.headers.Accept = fetchOpts.headers.Accept ?? 'application/json'
-  const response = await performRequest(url, fetchOpts, detachSignal)
+  const response = await performRequest(url, fetchOpts)
   if (!response.ok) {
     const body = await maybeGetResponseBody(response)
     throw new RequestFailedError(url, opts, response, body)
@@ -53,8 +53,8 @@ async function fetchStream(url, opts = {}) {
 }

 async function fetchStreamWithResponse(url, opts = {}) {
-  const { fetchOpts, abortController, detachSignal } = parseOpts(opts)
-  const response = await performRequest(url, fetchOpts, detachSignal)
+  const { fetchOpts, abortController } = parseOpts(opts)
+  const response = await performRequest(url, fetchOpts)
   if (!response.ok) {
     const body = await maybeGetResponseBody(response)
@@ -76,8 +76,8 @@ async function fetchStreamWithResponse(url, opts = {}) {
  * @throws {RequestFailedError} if the response has a failure status code
  */
 async function fetchNothing(url, opts = {}) {
-  const { fetchOpts, detachSignal } = parseOpts(opts)
-  const response = await performRequest(url, fetchOpts, detachSignal)
+  const { fetchOpts } = parseOpts(opts)
+  const response = await performRequest(url, fetchOpts)
   if (!response.ok) {
     const body = await maybeGetResponseBody(response)
     throw new RequestFailedError(url, opts, response, body)
@@ -95,22 +95,9 @@ async function fetchNothing(url, opts = {}) {
  * @throws {RequestFailedError} if the response has a non redirect status code or missing Location header
  */
 async function fetchRedirect(url, opts = {}) {
-  const { location } = await fetchRedirectWithResponse(url, opts)
-  return location
-}
-
-/**
- * Make a request and extract the redirect from the response.
- *
- * @param {string | URL} url - request URL
- * @param {object} opts - fetch options
- * @return {Promise<{location: string, response: Response}>}
- * @throws {RequestFailedError} if the response has a non redirect status code or missing Location header
- */
-async function fetchRedirectWithResponse(url, opts = {}) {
-  const { fetchOpts, detachSignal } = parseOpts(opts)
+  const { fetchOpts } = parseOpts(opts)
   fetchOpts.redirect = 'manual'
-  const response = await performRequest(url, fetchOpts, detachSignal)
+  const response = await performRequest(url, fetchOpts)
   if (response.status < 300 || response.status >= 400) {
     const body = await maybeGetResponseBody(response)
     throw new RequestFailedError(url, opts, response, body)
@@ -125,7 +112,7 @@ async function fetchRedirectWithResponse(url, opts = {}) {
     )
   }
   await discardResponseBody(response)
-  return { location, response }
+  return location
 }

 /**
@@ -142,8 +129,8 @@ async function fetchString(url, opts = {}) {
 }

 async function fetchStringWithResponse(url, opts = {}) {
-  const { fetchOpts, detachSignal } = parseOpts(opts)
-  const response = await performRequest(url, fetchOpts, detachSignal)
+  const { fetchOpts } = parseOpts(opts)
+  const response = await performRequest(url, fetchOpts)
   if (!response.ok) {
     const body = await maybeGetResponseBody(response)
     throw new RequestFailedError(url, opts, response, body)
@@ -178,14 +165,13 @@ function parseOpts(opts) {
   const abortController = new AbortController()
   fetchOpts.signal = abortController.signal

-  let detachSignal = () => {}
   if (opts.signal) {
-    detachSignal = abortOnSignal(abortController, opts.signal)
+    abortOnSignal(abortController, opts.signal)
   }
   if (opts.body instanceof Readable) {
     abortOnDestroyedRequest(abortController, fetchOpts.body)
   }
-  return { fetchOpts, abortController, detachSignal }
+  return { fetchOpts, abortController }
 }

 function setupJsonBody(fetchOpts, json) {
@@ -209,9 +195,6 @@ function abortOnSignal(abortController, signal) {
     abortController.abort(signal.reason)
   }
   signal.addEventListener('abort', listener)
-  return () => {
-    signal.removeEventListener('abort', listener)
-  }
 }

 function abortOnDestroyedRequest(abortController, stream) {
@@ -230,12 +213,11 @@ function abortOnDestroyedResponse(abortController, response) {
   })
 }

-async function performRequest(url, fetchOpts, detachSignal) {
+async function performRequest(url, fetchOpts) {
   let response
   try {
     response = await fetch(url, fetchOpts)
   } catch (err) {
-    detachSignal()
     if (fetchOpts.body instanceof Readable) {
       fetchOpts.body.destroy()
     }
@@ -244,7 +226,6 @@ async function performRequest(url, fetchOpts, detachSignal) {
       method: fetchOpts.method ?? 'GET',
     })
   }
-  response.body.on('close', detachSignal)
   if (fetchOpts.body instanceof Readable) {
     response.body.on('close', () => {
       if (!fetchOpts.body.readableEnded) {
@@ -316,7 +297,6 @@ module.exports = {
   fetchStreamWithResponse,
   fetchNothing,
   fetchRedirect,
-  fetchRedirectWithResponse,
   fetchString,
   fetchStringWithResponse,
   RequestFailedError,


@@ -20,8 +20,8 @@
     "body-parser": "^1.20.3",
     "chai": "^4.3.6",
     "chai-as-promised": "^7.1.1",
-    "express": "^4.21.2",
-    "mocha": "^11.1.0",
+    "express": "^4.21.0",
+    "mocha": "^10.2.0",
     "typescript": "^5.0.4"
   },
   "dependencies": {


@@ -1,9 +1,6 @@
 const { expect } = require('chai')
-const fs = require('node:fs')
-const events = require('node:events')
 const { FetchError, AbortError } = require('node-fetch')
 const { Readable } = require('node:stream')
-const { pipeline } = require('node:stream/promises')
 const { once } = require('node:events')
 const { TestServer } = require('./helpers/TestServer')
 const selfsigned = require('selfsigned')
@@ -206,31 +203,6 @@ describe('fetch-utils', function () {
     ).to.be.rejectedWith(AbortError)
     expect(stream.destroyed).to.be.true
   })
-
-  it('detaches from signal on success', async function () {
-    const signal = AbortSignal.timeout(10_000)
-    for (let i = 0; i < 20; i++) {
-      const s = await fetchStream(this.url('/hello'), { signal })
-      expect(events.getEventListeners(signal, 'abort')).to.have.length(1)
-      await pipeline(s, fs.createWriteStream('/dev/null'))
-      expect(events.getEventListeners(signal, 'abort')).to.have.length(0)
-    }
-  })
-
-  it('detaches from signal on error', async function () {
-    const signal = AbortSignal.timeout(10_000)
-    for (let i = 0; i < 20; i++) {
-      try {
-        await fetchStream(this.url('/500'), { signal })
-      } catch (err) {
-        if (err instanceof RequestFailedError && err.response.status === 500)
-          continue
-        throw err
-      } finally {
-        expect(events.getEventListeners(signal, 'abort')).to.have.length(0)
-      }
-    }
-  })
 })

 describe('fetchNothing', function () {
@@ -419,16 +391,9 @@ async function* infiniteIterator() {
 async function abortOnceReceived(func, server) {
   const controller = new AbortController()
   const promise = func(controller.signal)
-  expect(events.getEventListeners(controller.signal, 'abort')).to.have.length(1)
   await once(server.events, 'request-received')
   controller.abort()
-  try {
-    return await promise
-  } finally {
-    expect(events.getEventListeners(controller.signal, 'abort')).to.have.length(
-      0
-    )
-  }
+  return await promise
 }

 async function expectRequestAborted(req) {


@@ -0,0 +1 @@
+node_modules/

libraries/logger/.gitignore (new file, 3 lines)

@@ -0,0 +1,3 @@
+node_modules
+.npmrc


@@ -1 +1 @@
-22.17.0
+20.18.0


@@ -1,10 +1,10 @@
 logger
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0


@@ -14,10 +14,8 @@ const LoggingManager = {
   initialize(name) {
     this.isProduction =
       (process.env.NODE_ENV || '').toLowerCase() === 'production'
-    const isTest = (process.env.NODE_ENV || '').toLowerCase() === 'test'
     this.defaultLevel =
-      process.env.LOG_LEVEL ||
-      (this.isProduction ? 'info' : isTest ? 'fatal' : 'debug')
+      process.env.LOG_LEVEL || (this.isProduction ? 'info' : 'debug')
     this.loggerName = name
     this.logger = bunyan.createLogger({
       name,
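The left-hand (ext-ce) side of this hunk adds a quieter default log level for test runs: an explicit `LOG_LEVEL` still wins, otherwise production gets `info`, test gets `fatal`, and everything else gets `debug`. A standalone sketch of that selection logic (the `defaultLogLevel` helper is illustrative, not part of the library):

```javascript
// Default-level selection as on the ext-ce side of the diff:
// explicit LOG_LEVEL overrides; NODE_ENV picks the fallback.
function defaultLogLevel(env) {
  const nodeEnv = (env.NODE_ENV || '').toLowerCase()
  const isProduction = nodeEnv === 'production'
  const isTest = nodeEnv === 'test'
  return env.LOG_LEVEL || (isProduction ? 'info' : isTest ? 'fatal' : 'debug')
}

console.log(defaultLogLevel({ NODE_ENV: 'production' })) // info
console.log(defaultLogLevel({ NODE_ENV: 'test' }))       // fatal
console.log(defaultLogLevel({}))                         // debug
console.log(defaultLogLevel({ LOG_LEVEL: 'warn' }))      // warn
```

Raising the test default to `fatal` keeps unit-test output free of routine log noise without changing behavior in production or development.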


@@ -27,7 +27,7 @@
   },
   "devDependencies": {
     "chai": "^4.3.6",
-    "mocha": "^11.1.0",
+    "mocha": "^10.2.0",
     "sandboxed-module": "^2.0.4",
     "sinon": "^9.2.4",
     "sinon-chai": "^3.7.0",


@@ -0,0 +1 @@
+node_modules/

libraries/metrics/.gitignore (new file, 3 lines)

@@ -0,0 +1,3 @@
+node_modules
+.npmrc


@@ -1 +1 @@
-22.17.0
+20.18.0


@@ -1,10 +1,10 @@
 metrics
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0


@@ -5,8 +5,6 @@
  * before any other module to support code instrumentation.
  */

-const metricsModuleImportStartTime = performance.now()
-
 const APP_NAME = process.env.METRICS_APP_NAME || 'unknown'
 const BUILD_VERSION = process.env.BUILD_VERSION
 const ENABLE_PROFILE_AGENT = process.env.ENABLE_PROFILE_AGENT === 'true'
@@ -105,5 +103,3 @@ function recordProcessStart() {
   const metrics = require('.')
   metrics.inc('process_startup')
 }
-
-module.exports = { metricsModuleImportStartTime }


@ -9,7 +9,7 @@
"main": "index.js", "main": "index.js",
"dependencies": { "dependencies": {
"@google-cloud/opentelemetry-cloud-trace-exporter": "^2.1.0", "@google-cloud/opentelemetry-cloud-trace-exporter": "^2.1.0",
"@google-cloud/profiler": "^6.0.3", "@google-cloud/profiler": "^6.0.0",
"@opentelemetry/api": "^1.4.1", "@opentelemetry/api": "^1.4.1",
"@opentelemetry/auto-instrumentations-node": "^0.39.1", "@opentelemetry/auto-instrumentations-node": "^0.39.1",
"@opentelemetry/exporter-trace-otlp-http": "^0.41.2", "@opentelemetry/exporter-trace-otlp-http": "^0.41.2",
@ -23,7 +23,7 @@
"devDependencies": { "devDependencies": {
"bunyan": "^1.0.0", "bunyan": "^1.0.0",
"chai": "^4.3.6", "chai": "^4.3.6",
"mocha": "^11.1.0", "mocha": "^10.2.0",
"sandboxed-module": "^2.0.4", "sandboxed-module": "^2.0.4",
"sinon": "^9.2.4", "sinon": "^9.2.4",
"typescript": "^5.0.4" "typescript": "^5.0.4"


@ -0,0 +1 @@
node_modules/

libraries/mongo-utils/.gitignore vendored Normal file

@ -0,0 +1,3 @@
# managed by monorepo$ bin/update_build_scripts
.npmrc


@ -1 +1 @@
22.17.0 20.18.0


@ -16,7 +16,6 @@ let VERBOSE_LOGGING
let BATCH_RANGE_START let BATCH_RANGE_START
let BATCH_RANGE_END let BATCH_RANGE_END
let BATCH_MAX_TIME_SPAN_IN_MS let BATCH_MAX_TIME_SPAN_IN_MS
let BATCHED_UPDATE_RUNNING = false
/** /**
* @typedef {import("mongodb").Collection} Collection * @typedef {import("mongodb").Collection} Collection
@ -35,7 +34,6 @@ let BATCHED_UPDATE_RUNNING = false
* @property {string} [BATCH_RANGE_START] * @property {string} [BATCH_RANGE_START]
* @property {string} [BATCH_SIZE] * @property {string} [BATCH_SIZE]
* @property {string} [VERBOSE_LOGGING] * @property {string} [VERBOSE_LOGGING]
* @property {(progress: string) => Promise<void>} [trackProgress]
*/ */
/** /**
@ -211,71 +209,59 @@ async function batchedUpdate(
update, update,
projection, projection,
findOptions, findOptions,
batchedUpdateOptions = {} batchedUpdateOptions
) { ) {
// only a single batchedUpdate can run at a time due to global variables ID_EDGE_PAST = await getIdEdgePast(collection)
if (BATCHED_UPDATE_RUNNING) { if (!ID_EDGE_PAST) {
throw new Error('batchedUpdate is already running') console.warn(
`The collection ${collection.collectionName} appears to be empty.`
)
return 0
} }
try { refreshGlobalOptionsForBatchedUpdate(batchedUpdateOptions)
BATCHED_UPDATE_RUNNING = true
ID_EDGE_PAST = await getIdEdgePast(collection)
if (!ID_EDGE_PAST) {
console.warn(
`The collection ${collection.collectionName} appears to be empty.`
)
return 0
}
refreshGlobalOptionsForBatchedUpdate(batchedUpdateOptions)
const { trackProgress = async progress => console.warn(progress) } =
batchedUpdateOptions
findOptions = findOptions || {} findOptions = findOptions || {}
findOptions.readPreference = READ_PREFERENCE_SECONDARY findOptions.readPreference = READ_PREFERENCE_SECONDARY
projection = projection || { _id: 1 } projection = projection || { _id: 1 }
let nextBatch let nextBatch
let updated = 0 let updated = 0
let start = BATCH_RANGE_START let start = BATCH_RANGE_START
while (start !== BATCH_RANGE_END) { while (start !== BATCH_RANGE_END) {
let end = getNextEnd(start) let end = getNextEnd(start)
nextBatch = await getNextBatch( nextBatch = await getNextBatch(
collection, collection,
query, query,
start, start,
end, end,
projection, projection,
findOptions findOptions
) )
if (nextBatch.length > 0) { if (nextBatch.length > 0) {
end = nextBatch[nextBatch.length - 1]._id end = nextBatch[nextBatch.length - 1]._id
updated += nextBatch.length updated += nextBatch.length
if (VERBOSE_LOGGING) { if (VERBOSE_LOGGING) {
console.log( console.log(
`Running update on batch with ids ${JSON.stringify( `Running update on batch with ids ${JSON.stringify(
nextBatch.map(entry => entry._id) nextBatch.map(entry => entry._id)
)}` )}`
)
}
await trackProgress(
`Running update on batch ending ${renderObjectId(end)}`
) )
} else {
if (typeof update === 'function') { console.error(`Running update on batch ending ${renderObjectId(end)}`)
await update(nextBatch) }
} else {
await performUpdate(collection, nextBatch, update) if (typeof update === 'function') {
} await update(nextBatch)
} else {
await performUpdate(collection, nextBatch, update)
} }
await trackProgress(`Completed batch ending ${renderObjectId(end)}`)
start = end
} }
return updated console.error(`Completed batch ending ${renderObjectId(end)}`)
} finally { start = end
BATCHED_UPDATE_RUNNING = false
} }
return updated
} }
/** /**
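The rewritten `batchedUpdate` above wraps its body in `try`/`finally` so that a module-level flag enforces a single run at a time. Reduced to a sketch (`runExclusively` is a made-up name, not part of mongo-utils):

```javascript
// Sketch of the re-entrancy guard added to batchedUpdate above: a
// module-level flag plus try/finally. The finally block clears the flag
// even when the wrapped work throws, so a failed run never blocks later ones.
let RUNNING = false

async function runExclusively(work) {
  if (RUNNING) {
    throw new Error('already running')
  }
  try {
    RUNNING = true
    return await work()
  } finally {
    RUNNING = false
  }
}
```

A second call made while the first is still awaiting rejects immediately; sequential calls succeed because `finally` resets the flag.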


@ -1,10 +1,10 @@
mongo-utils mongo-utils
--dependencies=None --dependencies=None
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker --docker-repos=gcr.io/overleaf-ops
--env-add= --env-add=
--env-pass-through= --env-pass-through=
--esmock-loader=False --esmock-loader=False
--is-library=True --is-library=True
--node-version=22.17.0 --node-version=20.18.0
--public-repo=False --public-repo=False
--script-version=4.7.0 --script-version=4.5.0


@ -16,12 +16,12 @@
"author": "Overleaf (https://www.overleaf.com)", "author": "Overleaf (https://www.overleaf.com)",
"license": "AGPL-3.0-only", "license": "AGPL-3.0-only",
"dependencies": { "dependencies": {
"mongodb": "6.12.0", "mongodb": "6.10.0",
"mongodb-legacy": "6.1.3" "mongodb-legacy": "6.1.3"
}, },
"devDependencies": { "devDependencies": {
"chai": "^4.3.6", "chai": "^4.3.6",
"mocha": "^11.1.0", "mocha": "^10.2.0",
"sandboxed-module": "^2.0.4", "sandboxed-module": "^2.0.4",
"sinon": "^9.2.4", "sinon": "^9.2.4",
"sinon-chai": "^3.7.0", "sinon-chai": "^3.7.0",


@ -0,0 +1 @@
node_modules/

libraries/o-error/.gitignore vendored Normal file

@ -0,0 +1,5 @@
.nyc_output
coverage
node_modules/
.npmrc


@ -1 +1 @@
22.17.0 20.18.0


@ -1,10 +1,10 @@
o-error o-error
--dependencies=None --dependencies=None
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker --docker-repos=gcr.io/overleaf-ops
--env-add= --env-add=
--env-pass-through= --env-pass-through=
--esmock-loader=False --esmock-loader=False
--is-library=True --is-library=True
--node-version=22.17.0 --node-version=20.18.0
--public-repo=False --public-repo=False
--script-version=4.7.0 --script-version=4.5.0


@ -1,34 +1,20 @@
// @ts-check
/** /**
* Light-weight helpers for handling JavaScript Errors in node.js and the * Light-weight helpers for handling JavaScript Errors in node.js and the
* browser. * browser.
*/ */
class OError extends Error { class OError extends Error {
/**
* The error that is the underlying cause of this error
*
* @type {unknown}
*/
cause
/**
* List of errors encountered as the callback chain is unwound
*
* @type {TaggedError[] | undefined}
*/
_oErrorTags
/** /**
* @param {string} message as for built-in Error * @param {string} message as for built-in Error
* @param {Object} [info] extra data to attach to the error * @param {Object} [info] extra data to attach to the error
* @param {unknown} [cause] the internal error that caused this error * @param {Error} [cause] the internal error that caused this error
*/ */
constructor(message, info, cause) { constructor(message, info, cause) {
super(message) super(message)
this.name = this.constructor.name this.name = this.constructor.name
if (info) this.info = info if (info) this.info = info
if (cause) this.cause = cause if (cause) this.cause = cause
/** @private @type {Array<TaggedError> | undefined} */
this._oErrorTags // eslint-disable-line
} }
/** /**
@ -45,7 +31,7 @@ class OError extends Error {
/** /**
* Wrap the given error, which caused this error. * Wrap the given error, which caused this error.
* *
* @param {unknown} cause the internal error that caused this error * @param {Error} cause the internal error that caused this error
* @return {this} * @return {this}
*/ */
withCause(cause) { withCause(cause) {
@ -79,16 +65,13 @@ class OError extends Error {
* } * }
* } * }
* *
* @template {unknown} E * @param {Error} error the error to tag
* @param {E} error the error to tag
* @param {string} [message] message with which to tag `error` * @param {string} [message] message with which to tag `error`
* @param {Object} [info] extra data with which to tag `error` * @param {Object} [info] extra data with which to tag `error`
* @return {E} the modified `error` argument * @return {Error} the modified `error` argument
*/ */
static tag(error, message, info) { static tag(error, message, info) {
const oError = /** @type {{ _oErrorTags: TaggedError[] | undefined }} */ ( const oError = /** @type{OError} */ (error)
error
)
if (!oError._oErrorTags) oError._oErrorTags = [] if (!oError._oErrorTags) oError._oErrorTags = []
@ -119,7 +102,7 @@ class OError extends Error {
* *
* If an info property is repeated, the last one wins. * If an info property is repeated, the last one wins.
* *
* @param {unknown} error any error (may or may not be an `OError`) * @param {Error | null | undefined} error any error (may or may not be an `OError`)
* @return {Object} * @return {Object}
*/ */
static getFullInfo(error) { static getFullInfo(error) {
@ -146,7 +129,7 @@ class OError extends Error {
* Return the `stack` property from `error`, including the `stack`s for any * Return the `stack` property from `error`, including the `stack`s for any
* tagged errors added with `OError.tag` and for any `cause`s. * tagged errors added with `OError.tag` and for any `cause`s.
* *
* @param {unknown} error any error (may or may not be an `OError`) * @param {Error | null | undefined} error any error (may or may not be an `OError`)
* @return {string} * @return {string}
*/ */
static getFullStack(error) { static getFullStack(error) {
@ -160,7 +143,7 @@ class OError extends Error {
stack += `\n${oError._oErrorTags.map(tag => tag.stack).join('\n')}` stack += `\n${oError._oErrorTags.map(tag => tag.stack).join('\n')}`
} }
const causeStack = OError.getFullStack(oError.cause) const causeStack = oError.cause && OError.getFullStack(oError.cause)
if (causeStack) { if (causeStack) {
stack += '\ncaused by:\n' + indent(causeStack) stack += '\ncaused by:\n' + indent(causeStack)
} }
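The typing changes above widen `cause` from `Error` to `unknown`, so non-Error values (strings, plain objects) can be attached and later unwound. A stripped-down model showing the accepted shapes (`MiniOError` is not the real class, just the constructor/`withCause` behaviour):

```javascript
// Minimal model of the widened `cause` handling: any truthy value may be
// attached as the cause, either at construction time or via withCause().
class MiniOError extends Error {
  constructor(message, info, cause) {
    super(message)
    this.name = this.constructor.name
    if (info) this.info = info
    if (cause) this.cause = cause // cause is `unknown`, not only Error
  }

  withCause(cause) {
    this.cause = cause
    return this
  }
}

const err1 = new MiniOError('foo', {}, 'not-an-error')
const err2 = new MiniOError('foo').withCause({ code: 42 })
console.log(err1.cause, err2.cause)
```

This mirrors the new test case in the diff below, which asserts that string causes survive both paths.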


@ -34,7 +34,7 @@
"@types/chai": "^4.3.0", "@types/chai": "^4.3.0",
"@types/node": "^18.17.4", "@types/node": "^18.17.4",
"chai": "^4.3.6", "chai": "^4.3.6",
"mocha": "^11.1.0", "mocha": "^10.2.0",
"typescript": "^5.0.4" "typescript": "^5.0.4"
} }
} }


@ -268,11 +268,6 @@ describe('utils', function () {
expect(OError.getFullInfo(null)).to.deep.equal({}) expect(OError.getFullInfo(null)).to.deep.equal({})
}) })
it('works when given a string', function () {
const err = 'not an error instance'
expect(OError.getFullInfo(err)).to.deep.equal({})
})
it('works on a normal error', function () { it('works on a normal error', function () {
const err = new Error('foo') const err = new Error('foo')
expect(OError.getFullInfo(err)).to.deep.equal({}) expect(OError.getFullInfo(err)).to.deep.equal({})


@ -35,14 +35,6 @@ describe('OError', function () {
expect(err2.cause.message).to.equal('cause 2') expect(err2.cause.message).to.equal('cause 2')
}) })
it('accepts non-Error causes', function () {
const err1 = new OError('foo', {}, 'not-an-error')
expect(err1.cause).to.equal('not-an-error')
const err2 = new OError('foo').withCause('not-an-error')
expect(err2.cause).to.equal('not-an-error')
})
it('handles a custom error type with a cause', function () { it('handles a custom error type with a cause', function () {
function doSomethingBadInternally() { function doSomethingBadInternally() {
throw new Error('internal error') throw new Error('internal error')


@ -0,0 +1 @@
node_modules/

libraries/object-persistor/.gitignore vendored Normal file

@ -0,0 +1,4 @@
/node_modules
*.swp
.npmrc


@ -1 +1 @@
22.17.0 20.18.0


@ -1,10 +1,10 @@
object-persistor object-persistor
--dependencies=None --dependencies=None
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker --docker-repos=gcr.io/overleaf-ops
--env-add= --env-add=
--env-pass-through= --env-pass-through=
--esmock-loader=False --esmock-loader=False
--is-library=True --is-library=True
--node-version=22.17.0 --node-version=20.18.0
--public-repo=False --public-repo=False
--script-version=4.7.0 --script-version=4.5.0


@ -34,9 +34,9 @@
"devDependencies": { "devDependencies": {
"chai": "^4.3.6", "chai": "^4.3.6",
"chai-as-promised": "^7.1.1", "chai-as-promised": "^7.1.1",
"mocha": "^11.1.0", "mocha": "^10.2.0",
"mock-fs": "^5.2.0", "mock-fs": "^5.2.0",
"mongodb": "6.12.0", "mongodb": "6.10.0",
"sandboxed-module": "^2.0.4", "sandboxed-module": "^2.0.4",
"sinon": "^9.2.4", "sinon": "^9.2.4",
"sinon-chai": "^3.7.0", "sinon-chai": "^3.7.0",


@ -305,10 +305,8 @@ module.exports = class FSPersistor extends AbstractPersistor {
async _listDirectory(path) { async _listDirectory(path) {
if (this.useSubdirectories) { if (this.useSubdirectories) {
// eslint-disable-next-line @typescript-eslint/return-await
return await glob(Path.join(path, '**')) return await glob(Path.join(path, '**'))
} else { } else {
// eslint-disable-next-line @typescript-eslint/return-await
return await glob(`${path}_*`) return await glob(`${path}_*`)
} }
} }


@ -33,10 +33,6 @@ const AES256_KEY_LENGTH = 32
* @property {() => Promise<Array<RootKeyEncryptionKey>>} getRootKeyEncryptionKeys * @property {() => Promise<Array<RootKeyEncryptionKey>>} getRootKeyEncryptionKeys
*/ */
/**
* @typedef {import('./types').ListDirectoryResult} ListDirectoryResult
*/
/** /**
* Helper function to make TS happy when accessing error properties * Helper function to make TS happy when accessing error properties
* AWSError is not an actual class, so we cannot use instanceof. * AWSError is not an actual class, so we cannot use instanceof.
@ -347,10 +343,9 @@ class PerProjectEncryptedS3Persistor extends S3Persistor {
} }
async deleteDirectory(bucketName, path, continuationToken) { async deleteDirectory(bucketName, path, continuationToken) {
// Let [Settings.pathToProjectFolder] validate the project path before deleting things.
const { projectFolder, dekPath } = this.#buildProjectPaths(bucketName, path)
// Note: Listing/Deleting a prefix does not require SSE-C credentials. // Note: Listing/Deleting a prefix does not require SSE-C credentials.
await super.deleteDirectory(bucketName, path, continuationToken) await super.deleteDirectory(bucketName, path, continuationToken)
const { projectFolder, dekPath } = this.#buildProjectPaths(bucketName, path)
if (projectFolder === path) { if (projectFolder === path) {
await super.deleteObject( await super.deleteObject(
this.#settings.dataEncryptionKeyBucketName, this.#settings.dataEncryptionKeyBucketName,
@ -395,9 +390,9 @@ class PerProjectEncryptedS3Persistor extends S3Persistor {
* A general "cache" for project keys is another alternative. For now, use a helper class. * A general "cache" for project keys is another alternative. For now, use a helper class.
*/ */
class CachedPerProjectEncryptedS3Persistor { class CachedPerProjectEncryptedS3Persistor {
/** @type SSECOptions */ /** @type SSECOptions */
#projectKeyOptions #projectKeyOptions
/** @type PerProjectEncryptedS3Persistor */ /** @type PerProjectEncryptedS3Persistor */
#parent #parent
/** /**
@ -418,26 +413,6 @@ class CachedPerProjectEncryptedS3Persistor {
return await this.sendStream(bucketName, path, fs.createReadStream(fsPath)) return await this.sendStream(bucketName, path, fs.createReadStream(fsPath))
} }
/**
*
* @param {string} bucketName
* @param {string} path
* @return {Promise<number>}
*/
async getObjectSize(bucketName, path) {
return await this.#parent.getObjectSize(bucketName, path)
}
/**
*
* @param {string} bucketName
* @param {string} path
* @return {Promise<ListDirectoryResult>}
*/
async listDirectory(bucketName, path) {
return await this.#parent.listDirectory(bucketName, path)
}
/** /**
* @param {string} bucketName * @param {string} bucketName
* @param {string} path * @param {string} path


@ -20,18 +20,6 @@ const { URL } = require('node:url')
const { WriteError, ReadError, NotFoundError } = require('./Errors') const { WriteError, ReadError, NotFoundError } = require('./Errors')
const zlib = require('node:zlib') const zlib = require('node:zlib')
/**
* @typedef {import('aws-sdk/clients/s3').ListObjectsV2Output} ListObjectsV2Output
*/
/**
* @typedef {import('aws-sdk/clients/s3').Object} S3Object
*/
/**
* @typedef {import('./types').ListDirectoryResult} ListDirectoryResult
*/
/** /**
* Wrapper with private fields to avoid revealing them on console, JSON.stringify or similar. * Wrapper with private fields to avoid revealing them on console, JSON.stringify or similar.
*/ */
@ -278,12 +266,26 @@ class S3Persistor extends AbstractPersistor {
* @return {Promise<void>} * @return {Promise<void>}
*/ */
async deleteDirectory(bucketName, key, continuationToken) { async deleteDirectory(bucketName, key, continuationToken) {
const { contents, response } = await this.listDirectory( let response
bucketName, const options = { Bucket: bucketName, Prefix: key }
key, if (continuationToken) {
continuationToken options.ContinuationToken = continuationToken
) }
const objects = contents.map(item => ({ Key: item.Key || '' }))
try {
response = await this._getClientForBucket(bucketName)
.listObjectsV2(options)
.promise()
} catch (err) {
throw PersistorHelper.wrapError(
err,
'failed to list objects in S3',
{ bucketName, key },
ReadError
)
}
const objects = response.Contents?.map(item => ({ Key: item.Key || '' }))
if (objects?.length) { if (objects?.length) {
try { try {
await this._getClientForBucket(bucketName) await this._getClientForBucket(bucketName)
@ -314,36 +316,6 @@ class S3Persistor extends AbstractPersistor {
} }
} }
/**
*
* @param {string} bucketName
* @param {string} key
* @param {string} [continuationToken]
* @return {Promise<ListDirectoryResult>}
*/
async listDirectory(bucketName, key, continuationToken) {
let response
const options = { Bucket: bucketName, Prefix: key }
if (continuationToken) {
options.ContinuationToken = continuationToken
}
try {
response = await this._getClientForBucket(bucketName)
.listObjectsV2(options)
.promise()
} catch (err) {
throw PersistorHelper.wrapError(
err,
'failed to list objects in S3',
{ bucketName, key },
ReadError
)
}
return { contents: response.Contents ?? [], response }
}
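The hunk above extracts the `listObjectsV2` call into a reusable `listDirectory` method, which `deleteDirectory` now consumes instead of listing inline. The control flow, sketched against a fake client (none of these names are the AWS SDK; `FakeClient` below stands in for the real S3 client):

```javascript
// Sketch of the extraction above: deleteDirectory delegates listing to
// listDirectory, which returns both the raw contents and the full response
// (so callers can page with ContinuationToken).
class MiniPersistor {
  constructor(client) {
    this.client = client
  }

  async listDirectory(bucket, prefix, continuationToken) {
    const options = { Bucket: bucket, Prefix: prefix }
    if (continuationToken) options.ContinuationToken = continuationToken
    const response = await this.client.listObjectsV2(options)
    return { contents: response.Contents ?? [], response }
  }

  async deleteDirectory(bucket, prefix) {
    const { contents } = await this.listDirectory(bucket, prefix)
    const objects = contents.map(item => ({ Key: item.Key || '' }))
    if (objects.length) await this.client.deleteObjects(bucket, objects)
    return objects
  }
}
```

With a stub client that returns two keys for a prefix, `deleteDirectory('bkt', 'proj')` lists them through `listDirectory` and forwards both keys to the delete call.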
/** /**
* @param {string} bucketName * @param {string} bucketName
* @param {string} key * @param {string} key


@ -1,6 +0,0 @@
import type { ListObjectsV2Output, Object } from 'aws-sdk/clients/s3'
export type ListDirectoryResult = {
contents: Array<Object>
response: ListObjectsV2Output
}


@ -0,0 +1 @@
node_modules/


@ -0,0 +1,5 @@
/coverage
/node_modules
# managed by monorepo$ bin/update_build_scripts
.npmrc


@ -1 +1 @@
22.17.0 20.18.0


@ -1,10 +1,10 @@
overleaf-editor-core overleaf-editor-core
--dependencies=None --dependencies=None
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker --docker-repos=gcr.io/overleaf-ops
--env-add= --env-add=
--env-pass-through= --env-pass-through=
--esmock-loader=False --esmock-loader=False
--is-library=True --is-library=True
--node-version=22.17.0 --node-version=20.18.0
--public-repo=False --public-repo=False
--script-version=4.7.0 --script-version=4.5.0


@ -18,7 +18,6 @@ const MoveFileOperation = require('./lib/operation/move_file_operation')
const SetCommentStateOperation = require('./lib/operation/set_comment_state_operation') const SetCommentStateOperation = require('./lib/operation/set_comment_state_operation')
const EditFileOperation = require('./lib/operation/edit_file_operation') const EditFileOperation = require('./lib/operation/edit_file_operation')
const EditNoOperation = require('./lib/operation/edit_no_operation') const EditNoOperation = require('./lib/operation/edit_no_operation')
const EditOperationTransformer = require('./lib/operation/edit_operation_transformer')
const SetFileMetadataOperation = require('./lib/operation/set_file_metadata_operation') const SetFileMetadataOperation = require('./lib/operation/set_file_metadata_operation')
const NoOperation = require('./lib/operation/no_operation') const NoOperation = require('./lib/operation/no_operation')
const Operation = require('./lib/operation') const Operation = require('./lib/operation')
@ -44,8 +43,6 @@ const TrackingProps = require('./lib/file_data/tracking_props')
const Range = require('./lib/range') const Range = require('./lib/range')
const CommentList = require('./lib/file_data/comment_list') const CommentList = require('./lib/file_data/comment_list')
const LazyStringFileData = require('./lib/file_data/lazy_string_file_data') const LazyStringFileData = require('./lib/file_data/lazy_string_file_data')
const StringFileData = require('./lib/file_data/string_file_data')
const EditOperationBuilder = require('./lib/operation/edit_operation_builder')
exports.AddCommentOperation = AddCommentOperation exports.AddCommentOperation = AddCommentOperation
exports.Author = Author exports.Author = Author
@ -61,7 +58,6 @@ exports.DeleteCommentOperation = DeleteCommentOperation
exports.File = File exports.File = File
exports.FileMap = FileMap exports.FileMap = FileMap
exports.LazyStringFileData = LazyStringFileData exports.LazyStringFileData = LazyStringFileData
exports.StringFileData = StringFileData
exports.History = History exports.History = History
exports.Label = Label exports.Label = Label
exports.AddFileOperation = AddFileOperation exports.AddFileOperation = AddFileOperation
@ -69,8 +65,6 @@ exports.MoveFileOperation = MoveFileOperation
exports.SetCommentStateOperation = SetCommentStateOperation exports.SetCommentStateOperation = SetCommentStateOperation
exports.EditFileOperation = EditFileOperation exports.EditFileOperation = EditFileOperation
exports.EditNoOperation = EditNoOperation exports.EditNoOperation = EditNoOperation
exports.EditOperationBuilder = EditOperationBuilder
exports.EditOperationTransformer = EditOperationTransformer
exports.SetFileMetadataOperation = SetFileMetadataOperation exports.SetFileMetadataOperation = SetFileMetadataOperation
exports.NoOperation = NoOperation exports.NoOperation = NoOperation
exports.Operation = Operation exports.Operation = Operation


@ -13,7 +13,7 @@ const V2DocVersions = require('./v2_doc_versions')
/** /**
* @import Author from "./author" * @import Author from "./author"
* @import { BlobStore, RawChange, ReadonlyBlobStore } from "./types" * @import { BlobStore } from "./types"
*/ */
/** /**
@ -54,7 +54,7 @@ class Change {
/** /**
* For serialization. * For serialization.
* *
* @return {RawChange} * @return {Object}
*/ */
toRaw() { toRaw() {
function toRaw(object) { function toRaw(object) {
@ -100,9 +100,6 @@ class Change {
) )
} }
/**
* @return {Operation[]}
*/
getOperations() { getOperations() {
return this.operations return this.operations
} }
@ -219,7 +216,7 @@ class Change {
* If this Change contains any File objects, load them. * If this Change contains any File objects, load them.
* *
* @param {string} kind see {File#load} * @param {string} kind see {File#load}
* @param {ReadonlyBlobStore} blobStore * @param {BlobStore} blobStore
* @return {Promise<void>} * @return {Promise<void>}
*/ */
async loadFiles(kind, blobStore) { async loadFiles(kind, blobStore) {
@ -251,24 +248,6 @@ class Change {
* @param {boolean} [opts.strict] - Do not ignore recoverable errors * @param {boolean} [opts.strict] - Do not ignore recoverable errors
*/ */
applyTo(snapshot, opts = {}) { applyTo(snapshot, opts = {}) {
// eslint-disable-next-line no-unused-vars
for (const operation of this.iterativelyApplyTo(snapshot, opts)) {
// Nothing to do: we're just consuming the iterator for the side effects
}
}
/**
* Generator that applies this change to a snapshot and yields each
* operation after it has been applied.
*
* Recoverable errors (caused by historical bad data) are ignored unless
* opts.strict is true
*
* @param {Snapshot} snapshot modified in place
* @param {object} opts
* @param {boolean} [opts.strict] - Do not ignore recoverable errors
*/
*iterativelyApplyTo(snapshot, opts = {}) {
assert.object(snapshot, 'bad snapshot') assert.object(snapshot, 'bad snapshot')
for (const operation of this.operations) { for (const operation of this.operations) {
@ -282,7 +261,6 @@ class Change {
throw err throw err
} }
} }
yield operation
} }
// update project version if present in change // update project version if present in change
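In the hunk above, `applyTo` is reworked to drive a new generator, `iterativelyApplyTo`, which yields each operation once it has been applied. A self-contained model of that split, with operations reduced to plain functions (illustrative shapes, not the real `Operation` class):

```javascript
// Model of the generator refactor: iterativelyApplyTo applies each
// operation to the snapshot and yields it, so callers can observe
// per-operation progress; applyTo just drains the generator.
class MiniChange {
  constructor(operations) {
    this.operations = operations
  }

  *iterativelyApplyTo(snapshot) {
    for (const operation of this.operations) {
      operation(snapshot) // apply in place
      yield operation // let the caller react after each step
    }
  }

  applyTo(snapshot) {
    // eslint-disable-next-line no-unused-vars
    for (const operation of this.iterativelyApplyTo(snapshot)) {
      // nothing to do: consuming the iterator for its side effects
    }
  }
}
```

Both paths leave the snapshot in the same state; the generator form simply exposes the intermediate steps.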


@ -1,7 +1,7 @@
// @ts-check // @ts-check
/** /**
* @import { ClearTrackingPropsRawData, TrackingDirective } from '../types' * @import { ClearTrackingPropsRawData } from '../types'
*/ */
class ClearTrackingProps { class ClearTrackingProps {
@ -11,27 +11,12 @@ class ClearTrackingProps {
/** /**
* @param {any} other * @param {any} other
* @returns {other is ClearTrackingProps} * @returns {boolean}
*/ */
equals(other) { equals(other) {
return other instanceof ClearTrackingProps return other instanceof ClearTrackingProps
} }
/**
* @param {TrackingDirective} other
* @returns {other is ClearTrackingProps}
*/
canMergeWith(other) {
return other instanceof ClearTrackingProps
}
/**
* @param {TrackingDirective} other
*/
mergeWith(other) {
return this
}
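The new `canMergeWith`/`mergeWith` methods above let a clearing directive merge only with another clearing directive, and merging two of them is a no-op that returns the same instance. A toy version (`MiniClearTrackingProps` is illustrative, not the library class):

```javascript
// Toy model of the merge semantics added above: a "clear tracking"
// directive is compatible only with another clear directive, and merging
// changes nothing.
class MiniClearTrackingProps {
  equals(other) {
    return other instanceof MiniClearTrackingProps
  }

  canMergeWith(other) {
    return other instanceof MiniClearTrackingProps
  }

  mergeWith(other) {
    return this // merging two clearing directives yields the same directive
  }
}
```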
/** /**
* @returns {ClearTrackingPropsRawData} * @returns {ClearTrackingPropsRawData}
*/ */


@ -11,7 +11,7 @@ const EditOperation = require('../operation/edit_operation')
const EditOperationBuilder = require('../operation/edit_operation_builder') const EditOperationBuilder = require('../operation/edit_operation_builder')
/** /**
* @import { BlobStore, ReadonlyBlobStore, RangesBlob, RawHashFileData, RawLazyStringFileData } from '../types' * @import { BlobStore, ReadonlyBlobStore, RangesBlob, RawFileData, RawLazyStringFileData } from '../types'
*/ */
class LazyStringFileData extends FileData { class LazyStringFileData extends FileData {
@ -159,11 +159,11 @@ class LazyStringFileData extends FileData {
/** @inheritdoc /** @inheritdoc
* @param {BlobStore} blobStore * @param {BlobStore} blobStore
* @return {Promise<RawHashFileData>} * @return {Promise<RawFileData>}
*/ */
async store(blobStore) { async store(blobStore) {
if (this.operations.length === 0) { if (this.operations.length === 0) {
/** @type RawHashFileData */ /** @type RawFileData */
const raw = { hash: this.hash } const raw = { hash: this.hash }
if (this.rangesHash) { if (this.rangesHash) {
raw.rangesHash = this.rangesHash raw.rangesHash = this.rangesHash
@ -171,11 +171,9 @@ class LazyStringFileData extends FileData {
return raw return raw
} }
const eager = await this.toEager(blobStore) const eager = await this.toEager(blobStore)
const raw = await eager.store(blobStore)
this.hash = raw.hash
this.rangesHash = raw.rangesHash
this.operations.length = 0 this.operations.length = 0
return raw /** @type RawFileData */
return await eager.store(blobStore)
} }
} }
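The `store` change above makes the lazy file cache the hashes returned from storing its eager form and drop its pending operations, so a second `store` call takes the cheap hash-only path. A reduced model of that caching (names are illustrative, not the real `LazyStringFileData` API):

```javascript
// Model of the caching added to store() above: after the first store, the
// returned hash is remembered and pending operations are cleared, so a
// repeat call returns immediately without re-storing the content.
class LazyFile {
  constructor(storeEager) {
    this.hash = null
    this.operations = ['pending-op']
    this.storeEager = storeEager // async () => ({ hash, rangesHash })
  }

  async store() {
    if (this.operations.length === 0) {
      return { hash: this.hash } // fast path: already stored
    }
    const raw = await this.storeEager()
    this.hash = raw.hash
    this.rangesHash = raw.rangesHash
    this.operations.length = 0 // nothing pending anymore
    return raw
  }
}
```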


@ -8,7 +8,7 @@ const CommentList = require('./comment_list')
const TrackedChangeList = require('./tracked_change_list') const TrackedChangeList = require('./tracked_change_list')
/** /**
* @import { StringFileRawData, RawHashFileData, BlobStore, CommentRawData } from "../types" * @import { StringFileRawData, RawFileData, BlobStore, CommentRawData } from "../types"
* @import { TrackedChangeRawData, RangesBlob } from "../types" * @import { TrackedChangeRawData, RangesBlob } from "../types"
* @import EditOperation from "../operation/edit_operation" * @import EditOperation from "../operation/edit_operation"
*/ */
@ -88,14 +88,6 @@ class StringFileData extends FileData {
return content return content
} }
/**
* Return docstore view of a doc: each line separated
* @return {string[]}
*/
getLines() {
return this.getContent({ filterTrackedDeletes: true }).split('\n')
}
/** @inheritdoc */ /** @inheritdoc */
getByteLength() { getByteLength() {
return Buffer.byteLength(this.content) return Buffer.byteLength(this.content)
@ -139,7 +131,7 @@ class StringFileData extends FileData {
/** /**
* @inheritdoc * @inheritdoc
* @param {BlobStore} blobStore * @param {BlobStore} blobStore
* @return {Promise<RawHashFileData>} * @return {Promise<RawFileData>}
*/ */
async store(blobStore) { async store(blobStore) {
const blob = await blobStore.putString(this.content) const blob = await blobStore.putString(this.content)


@ -84,21 +84,6 @@ class TrackedChange {
) )
) )
} }
/**
* Return an equivalent tracked change whose extent is limited to the given
* range
*
* @param {Range} range
* @returns {TrackedChange | null} - the result or null if the intersection is empty
*/
intersectRange(range) {
const intersection = this.range.intersect(range)
if (intersection == null) {
return null
}
return new TrackedChange(intersection, this.tracking)
}
} }
module.exports = TrackedChange module.exports = TrackedChange
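`intersectRange` above clips a tracked change to a query range, returning `null` when the ranges do not overlap. The underlying interval intersection, sketched with a minimal `Range` stand-in (the real class has more methods):

```javascript
// Minimal Range stand-in showing the intersection used by intersectRange:
// the overlap starts at the larger `pos` and ends at the smaller end;
// an empty or negative overlap means no intersection.
class MiniRange {
  constructor(pos, length) {
    this.pos = pos
    this.length = length
  }

  get end() {
    return this.pos + this.length
  }

  intersect(other) {
    const pos = Math.max(this.pos, other.pos)
    const end = Math.min(this.end, other.end)
    return end > pos ? new MiniRange(pos, end - pos) : null
  }
}
```

For example, `[0, 10)` intersected with `[5, 15)` gives `[5, 10)`, while disjoint ranges give `null`, which is the case `intersectRange` maps to "no clipped change".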


@ -2,11 +2,9 @@
const Range = require('../range') const Range = require('../range')
const TrackedChange = require('./tracked_change') const TrackedChange = require('./tracked_change')
const TrackingProps = require('../file_data/tracking_props') const TrackingProps = require('../file_data/tracking_props')
const { InsertOp, RemoveOp, RetainOp } = require('../operation/scan_op')
/** /**
* @import { TrackingDirective, TrackedChangeRawData } from "../types" * @import { TrackingDirective, TrackedChangeRawData } from "../types"
* @import TextOperation from "../operation/text_operation"
*/ */
class TrackedChangeList { class TrackedChangeList {
@ -60,22 +58,6 @@ class TrackedChangeList {
return this._trackedChanges.filter(change => range.contains(change.range)) return this._trackedChanges.filter(change => range.contains(change.range))
} }
/**
* Returns tracked changes that overlap with the given range
* @param {Range} range
* @returns {TrackedChange[]}
*/
intersectRange(range) {
const changes = []
for (const change of this._trackedChanges) {
const intersection = change.intersectRange(range)
if (intersection != null) {
changes.push(intersection)
}
}
return changes
}
/** /**
* Returns the tracking props for a given range. * Returns the tracking props for a given range.
* @param {Range} range * @param {Range} range
@ -107,8 +89,6 @@ class TrackedChangeList {
/** /**
* Collapses consecutive (and compatible) ranges * Collapses consecutive (and compatible) ranges
*
* @private
* @returns {void} * @returns {void}
*/ */
_mergeRanges() { _mergeRanges() {
@ -137,28 +117,12 @@ class TrackedChangeList {
} }
/** /**
* Apply an insert operation
* *
* @param {number} cursor * @param {number} cursor
* @param {string} insertedText * @param {string} insertedText
* @param {{tracking?: TrackingProps}} opts * @param {{tracking?: TrackingProps}} opts
*/ */
applyInsert(cursor, insertedText, opts = {}) { applyInsert(cursor, insertedText, opts = {}) {
this._applyInsert(cursor, insertedText, opts)
this._mergeRanges()
}
/**
* Apply an insert operation
*
* This method will not merge ranges at the end
*
* @private
* @param {number} cursor
* @param {string} insertedText
* @param {{tracking?: TrackingProps}} [opts]
*/
_applyInsert(cursor, insertedText, opts = {}) {
const newTrackedChanges = [] const newTrackedChanges = []
for (const trackedChange of this._trackedChanges) { for (const trackedChange of this._trackedChanges) {
if ( if (
@ -207,29 +171,15 @@ class TrackedChangeList {
newTrackedChanges.push(newTrackedChange) newTrackedChanges.push(newTrackedChange)
} }
this._trackedChanges = newTrackedChanges this._trackedChanges = newTrackedChanges
this._mergeRanges()
} }
/** /**
* Apply a delete operation to the list of tracked changes
* *
* @param {number} cursor * @param {number} cursor
* @param {number} length * @param {number} length
*/ */
applyDelete(cursor, length) { applyDelete(cursor, length) {
this._applyDelete(cursor, length)
this._mergeRanges()
}
/**
* Apply a delete operation to the list of tracked changes
*
* This method will not merge ranges at the end
*
* @private
* @param {number} cursor
* @param {number} length
*/
_applyDelete(cursor, length) {
const newTrackedChanges = [] const newTrackedChanges = []
for (const trackedChange of this._trackedChanges) { for (const trackedChange of this._trackedChanges) {
const deletedRange = new Range(cursor, length) const deletedRange = new Range(cursor, length)
@ -255,31 +205,15 @@ class TrackedChangeList {
} }
} }
this._trackedChanges = newTrackedChanges this._trackedChanges = newTrackedChanges
}
/**
* Apply a retain operation to the list of tracked changes
*
* @param {number} cursor
* @param {number} length
* @param {{tracking?: TrackingDirective}} [opts]
*/
applyRetain(cursor, length, opts = {}) {
this._applyRetain(cursor, length, opts)
this._mergeRanges() this._mergeRanges()
} }
/** /**
* Apply a retain operation to the list of tracked changes
*
* This method will not merge ranges at the end
*
* @private
* @param {number} cursor * @param {number} cursor
* @param {number} length * @param {number} length
* @param {{tracking?: TrackingDirective}} opts * @param {{tracking?: TrackingDirective}} opts
*/ */
_applyRetain(cursor, length, opts = {}) { applyRetain(cursor, length, opts = {}) {
// If there's no tracking info, leave everything as-is // If there's no tracking info, leave everything as-is
if (!opts.tracking) { if (!opts.tracking) {
return return
@ -335,31 +269,6 @@ class TrackedChangeList {
newTrackedChanges.push(newTrackedChange) newTrackedChanges.push(newTrackedChange)
} }
this._trackedChanges = newTrackedChanges this._trackedChanges = newTrackedChanges
}
/**
* Apply a text operation to the list of tracked changes
*
* Ranges are merged only once at the end, for performance and to avoid
* problematic edge cases where intermediate ranges get incorrectly merged.
*
* @param {TextOperation} operation
*/
applyTextOperation(operation) {
// this cursor tracks the destination document that gets modified as
// operations are applied to it.
let cursor = 0
for (const op of operation.ops) {
if (op instanceof InsertOp) {
this._applyInsert(cursor, op.insertion, { tracking: op.tracking })
cursor += op.insertion.length
} else if (op instanceof RemoveOp) {
this._applyDelete(cursor, op.length)
} else if (op instanceof RetainOp) {
this._applyRetain(cursor, op.length, { tracking: op.tracking })
cursor += op.length
}
}
this._mergeRanges() this._mergeRanges()
} }
} }
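The `applyTextOperation` method above applies each op via the private no-merge variants and merges ranges only once at the end. A hedged sketch of that cursor walk, with simplified op shapes and a stub tracked-change list rather than the real classes:

```javascript
// Cursor walk over a text operation; tracked-change ranges are merged in a
// single pass at the end, as in applyTextOperation above. Ops are simplified
// to plain objects: { insert: string } | { remove: number } | { retain: number }.
function applyTextOperation(tracked, ops) {
  let cursor = 0 // position in the destination document
  for (const op of ops) {
    if (op.insert !== undefined) {
      tracked.applyInsert(cursor, op.insert) // no merge yet
      cursor += op.insert.length
    } else if (op.remove !== undefined) {
      tracked.applyDelete(cursor, op.remove) // deletes do not advance the cursor
    } else {
      cursor += op.retain
    }
  }
  tracked.mergeRanges() // merge once, at the end
  return cursor
}

// Record calls with a stub to see the order of operations:
const calls = []
const stub = {
  applyInsert: (c, s) => calls.push(['insert', c, s]),
  applyDelete: (c, n) => calls.push(['delete', c, n]),
  mergeRanges: () => calls.push(['merge']),
}
applyTextOperation(stub, [{ retain: 3 }, { insert: 'ab' }, { remove: 2 }])
console.log(calls)
// → [ [ 'insert', 3, 'ab' ], [ 'delete', 5, 2 ], [ 'merge' ] ]
```

Deferring the merge avoids the edge case called out in the doc comment, where intermediate ranges could be incorrectly merged between ops.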
@ -62,35 +62,6 @@ class TrackingProps {
this.ts.getTime() === other.ts.getTime() this.ts.getTime() === other.ts.getTime()
) )
} }
/**
* Are these tracking props compatible with the other tracking props for merging
* ranges?
*
* @param {TrackingDirective} other
* @returns {other is TrackingProps}
*/
canMergeWith(other) {
if (!(other instanceof TrackingProps)) {
return false
}
return this.type === other.type && this.userId === other.userId
}
/**
* Merge two tracking props
*
* Assumes that `canMerge(other)` returns true
*
* @param {TrackingDirective} other
*/
mergeWith(other) {
if (!this.canMergeWith(other)) {
throw new Error('Cannot merge with incompatible tracking props')
}
const ts = this.ts <= other.ts ? this.ts : other.ts
return new TrackingProps(this.type, this.userId, ts)
}
} }
module.exports = TrackingProps module.exports = TrackingProps
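The `canMergeWith`/`mergeWith` pair shown above merges ranges only when type and user match, and the merged range keeps the earlier timestamp. A standalone sketch mirroring the field names in the diff (not importing the library):

```javascript
// Two tracking-props objects can merge when type and userId match; the merged
// props keep the earlier of the two timestamps.
class TrackingProps {
  constructor(type, userId, ts) {
    this.type = type
    this.userId = userId
    this.ts = ts
  }
  canMergeWith(other) {
    if (!(other instanceof TrackingProps)) return false
    return this.type === other.type && this.userId === other.userId
  }
  mergeWith(other) {
    if (!this.canMergeWith(other)) {
      throw new Error('Cannot merge with incompatible tracking props')
    }
    const ts = this.ts <= other.ts ? this.ts : other.ts
    return new TrackingProps(this.type, this.userId, ts)
  }
}

const a = new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00Z'))
const b = new TrackingProps('insert', 'user1', new Date('2024-01-02T00:00:00Z'))
console.log(a.mergeWith(b).ts.toISOString()) // earlier timestamp wins
// → 2024-01-01T00:00:00.000Z
```

Note that timestamps deliberately do not participate in `canMergeWith`: consecutive edits by the same user merge even when made at different times.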
@ -22,7 +22,7 @@ class NonUniquePathnameError extends PathnameError {
* @param {string[]} pathnames * @param {string[]} pathnames
*/ */
constructor(pathnames) { constructor(pathnames) {
super('pathnames are not unique', { pathnames }) super('pathnames are not unique: ' + pathnames, { pathnames })
this.pathnames = pathnames this.pathnames = pathnames
} }
} }
@ -30,13 +30,9 @@ class NonUniquePathnameError extends PathnameError {
class BadPathnameError extends PathnameError { class BadPathnameError extends PathnameError {
/** /**
* @param {string} pathname * @param {string} pathname
* @param {string} reason
*/ */
constructor(pathname, reason) { constructor(pathname) {
if (pathname.length > 10) { super(pathname + ' is not a valid pathname', { pathname })
pathname = pathname.slice(0, 5) + '...' + pathname.slice(-5)
}
super('invalid pathname', { reason, pathname })
this.pathname = pathname this.pathname = pathname
} }
} }
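`BadPathnameError` above shortens long pathnames before embedding them in error info. The truncation rule can be illustrated on its own (`truncatePathname` is a hypothetical helper name, not part of the library):

```javascript
// Keep the first and last 5 characters of pathnames longer than 10 chars, as
// BadPathnameError does above, so error payloads stay short and readable.
function truncatePathname(pathname) {
  if (pathname.length > 10) {
    return pathname.slice(0, 5) + '...' + pathname.slice(-5)
  }
  return pathname
}

console.log(truncatePathname('a-very-long-pathname.tex')) // → a-ver...e.tex
console.log(truncatePathname('short.tex')) // short pathnames pass through unchanged
```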
@ -46,7 +42,7 @@ class PathnameConflictError extends PathnameError {
* @param {string} pathname * @param {string} pathname
*/ */
constructor(pathname) { constructor(pathname) {
super('pathname conflicts with another file', { pathname }) super(`pathname '${pathname}' conflicts with another file`, { pathname })
this.pathname = pathname this.pathname = pathname
} }
} }
@ -56,7 +52,7 @@ class FileNotFoundError extends PathnameError {
* @param {string} pathname * @param {string} pathname
*/ */
constructor(pathname) { constructor(pathname) {
super('file does not exist', { pathname }) super(`file ${pathname} does not exist`, { pathname })
this.pathname = pathname this.pathname = pathname
} }
} }
@ -319,9 +315,8 @@ function checkPathnamesAreUnique(files) {
*/ */
function checkPathname(pathname) { function checkPathname(pathname) {
assert.nonEmptyString(pathname, 'bad pathname') assert.nonEmptyString(pathname, 'bad pathname')
const [isClean, reason] = safePathname.isCleanDebug(pathname) if (safePathname.isClean(pathname)) return
if (isClean) return throw new FileMap.BadPathnameError(pathname)
throw new FileMap.BadPathnameError(pathname, reason)
} }
/** /**
@ -7,7 +7,7 @@ const Change = require('./change')
const Snapshot = require('./snapshot') const Snapshot = require('./snapshot')
/** /**
* @import { BlobStore, ReadonlyBlobStore } from "./types" * @import { BlobStore } from "./types"
*/ */
class History { class History {
@ -85,7 +85,7 @@ class History {
* If this History contains any File objects, load them. * If this History contains any File objects, load them.
* *
* @param {string} kind see {File#load} * @param {string} kind see {File#load}
* @param {ReadonlyBlobStore} blobStore * @param {BlobStore} blobStore
* @return {Promise<void>} * @return {Promise<void>}
*/ */
async loadFiles(kind, blobStore) { async loadFiles(kind, blobStore) {
@ -36,20 +36,6 @@ class EditOperationBuilder {
} }
throw new Error('Unsupported operation in EditOperationBuilder.fromJSON') throw new Error('Unsupported operation in EditOperationBuilder.fromJSON')
} }
/**
* @param {unknown} raw
* @return {raw is RawEditOperation}
*/
static isValid(raw) {
return (
isTextOperation(raw) ||
isRawAddCommentOperation(raw) ||
isRawDeleteCommentOperation(raw) ||
isRawSetCommentStateOperation(raw) ||
isRawEditNoOperation(raw)
)
}
} }
/** /**
@ -13,7 +13,7 @@ let EditFileOperation = null
let SetFileMetadataOperation = null let SetFileMetadataOperation = null
/** /**
* @import { ReadonlyBlobStore } from "../types" * @import { BlobStore } from "../types"
* @import Snapshot from "../snapshot" * @import Snapshot from "../snapshot"
*/ */
@ -80,7 +80,7 @@ class Operation {
* If this operation references any files, load the files. * If this operation references any files, load the files.
* *
* @param {string} kind see {File#load} * @param {string} kind see {File#load}
* @param {ReadonlyBlobStore} blobStore * @param {BlobStore} blobStore
* @return {Promise<void>} * @return {Promise<void>}
*/ */
async loadFiles(kind, blobStore) {} async loadFiles(kind, blobStore) {}
@ -175,7 +175,7 @@ class InsertOp extends ScanOp {
return false return false
} }
if (this.tracking) { if (this.tracking) {
if (!other.tracking || !this.tracking.canMergeWith(other.tracking)) { if (!this.tracking.equals(other.tracking)) {
return false return false
} }
} else if (other.tracking) { } else if (other.tracking) {
@ -198,10 +198,7 @@ class InsertOp extends ScanOp {
throw new Error('Cannot merge with incompatible operation') throw new Error('Cannot merge with incompatible operation')
} }
this.insertion += other.insertion this.insertion += other.insertion
if (this.tracking != null && other.tracking != null) { // We already have the same tracking info and commentIds
this.tracking = this.tracking.mergeWith(other.tracking)
}
// We already have the same commentIds
} }
/** /**
@ -309,13 +306,9 @@ class RetainOp extends ScanOp {
return false return false
} }
if (this.tracking) { if (this.tracking) {
if (!other.tracking || !this.tracking.canMergeWith(other.tracking)) { return this.tracking.equals(other.tracking)
return false
}
} else if (other.tracking) {
return false
} }
return true return !other.tracking
} }
/** /**
@ -326,9 +319,6 @@ class RetainOp extends ScanOp {
throw new Error('Cannot merge with incompatible operation') throw new Error('Cannot merge with incompatible operation')
} }
this.length += other.length this.length += other.length
if (this.tracking != null && other.tracking != null) {
this.tracking = this.tracking.mergeWith(other.tracking)
}
} }
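In the `InsertOp`/`RetainOp` hunks above, mergeability is relaxed from strict `equals` on tracking props to `canMergeWith`, so ops from the same user and type merge even when their timestamps differ. A hedged sketch of the predicate, using plain objects instead of the real op classes:

```javascript
// Two ops can merge when both carry compatible tracking info (same type and
// userId; timestamps may differ) or when neither op is tracked.
function canMergeTracking(a, b) {
  if (a.tracking) {
    if (!b.tracking) return false
    return (
      a.tracking.type === b.tracking.type &&
      a.tracking.userId === b.tracking.userId
    )
  }
  return !b.tracking
}

const t = ts => ({ type: 'insert', userId: 'user1', ts })
console.log(canMergeTracking({ tracking: t(1) }, { tracking: t(2) })) // → true
console.log(canMergeTracking({ tracking: t(1) }, {})) // → false
console.log(canMergeTracking({}, {})) // → true
```

This pairs with the `tracking.mergeWith(other.tracking)` calls in the merge methods above, which resolve the differing timestamps by keeping the earlier one.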
/** /**
@ -56,34 +56,18 @@ class TextOperation extends EditOperation {
constructor() { constructor() {
super() super()
// When an operation is applied to an input string, you can think of this as
/** // if an imaginary cursor runs over the entire string and skips over some
* When an operation is applied to an input string, you can think of this as // parts, removes some parts and inserts characters at some positions. These
* if an imaginary cursor runs over the entire string and skips over some // actions (skip/remove/insert) are stored as an array in the "ops" property.
* parts, removes some parts and inserts characters at some positions. These /** @type {ScanOp[]} */
* actions (skip/remove/insert) are stored as an array in the "ops" property.
* @type {ScanOp[]}
*/
this.ops = [] this.ops = []
// An operation's baseLength is the length of every string the operation
/** // can be applied to.
* An operation's baseLength is the length of every string the operation
* can be applied to.
*/
this.baseLength = 0 this.baseLength = 0
// The targetLength is the length of every string that results from applying
/** // the operation on a valid input string.
* The targetLength is the length of every string that results from applying
* the operation on a valid input string.
*/
this.targetLength = 0 this.targetLength = 0
/**
* The expected content hash after this operation is applied
*
* @type {string | null}
*/
this.contentHash = null
} }
/** /**
@ -239,12 +223,7 @@ class TextOperation extends EditOperation {
* @returns {RawTextOperation} * @returns {RawTextOperation}
*/ */
toJSON() { toJSON() {
/** @type {RawTextOperation} */ return { textOperation: this.ops.map(op => op.toJSON()) }
const json = { textOperation: this.ops.map(op => op.toJSON()) }
if (this.contentHash != null) {
json.contentHash = this.contentHash
}
return json
} }
/** /**
@ -252,7 +231,7 @@ class TextOperation extends EditOperation {
* @param {RawTextOperation} obj * @param {RawTextOperation} obj
* @returns {TextOperation} * @returns {TextOperation}
*/ */
static fromJSON = function ({ textOperation: ops, contentHash }) { static fromJSON = function ({ textOperation: ops }) {
const o = new TextOperation() const o = new TextOperation()
for (const op of ops) { for (const op of ops) {
if (isRetain(op)) { if (isRetain(op)) {
@ -271,9 +250,6 @@ class TextOperation extends EditOperation {
throw new UnprocessableError('unknown operation: ' + JSON.stringify(op)) throw new UnprocessableError('unknown operation: ' + JSON.stringify(op))
} }
} }
if (contentHash != null) {
o.contentHash = contentHash
}
return o return o
} }
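The `toJSON`/`fromJSON` changes above serialize the optional `contentHash` only when it is set, and restore it on parse. The round-trip can be sketched with simplified raw ops (strings for insertions, positive numbers for retains) rather than the library's actual classes:

```javascript
// contentHash is emitted only when present, and restored on parse.
function toJSON(operation) {
  const json = { textOperation: operation.ops.slice() }
  if (operation.contentHash != null) json.contentHash = operation.contentHash
  return json
}

function fromJSON({ textOperation, contentHash }) {
  const operation = { ops: textOperation.slice(), contentHash: null }
  if (contentHash != null) operation.contentHash = contentHash
  return operation
}

const op = { ops: [4, 'abc', 3], contentHash: 'deadbeef' }
console.log(toJSON(op))
// → { textOperation: [ 4, 'abc', 3 ], contentHash: 'deadbeef' }
console.log('contentHash' in toJSON({ ops: [1] })) // → false
```

Guarding with `!= null` keeps the wire format unchanged for operations that never had a hash, so older readers of `RawTextOperation` are unaffected.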
@ -314,18 +290,25 @@ class TextOperation extends EditOperation {
str str
) )
} }
file.trackedChanges.applyRetain(result.length, op.length, {
tracking: op.tracking,
})
result += str.slice(inputCursor, inputCursor + op.length) result += str.slice(inputCursor, inputCursor + op.length)
inputCursor += op.length inputCursor += op.length
} else if (op instanceof InsertOp) { } else if (op instanceof InsertOp) {
if (containsNonBmpChars(op.insertion)) { if (containsNonBmpChars(op.insertion)) {
throw new InvalidInsertionError(str, op.toJSON()) throw new InvalidInsertionError(str, op.toJSON())
} }
file.trackedChanges.applyInsert(result.length, op.insertion, {
tracking: op.tracking,
})
file.comments.applyInsert( file.comments.applyInsert(
new Range(result.length, op.insertion.length), new Range(result.length, op.insertion.length),
{ commentIds: op.commentIds } { commentIds: op.commentIds }
) )
result += op.insertion result += op.insertion
} else if (op instanceof RemoveOp) { } else if (op instanceof RemoveOp) {
file.trackedChanges.applyDelete(result.length, op.length)
file.comments.applyDelete(new Range(result.length, op.length)) file.comments.applyDelete(new Range(result.length, op.length))
inputCursor += op.length inputCursor += op.length
} else { } else {
@ -345,8 +328,6 @@ class TextOperation extends EditOperation {
throw new TextOperation.TooLongError(operation, result.length) throw new TextOperation.TooLongError(operation, result.length)
} }
file.trackedChanges.applyTextOperation(this)
file.content = result file.content = result
} }
@ -395,36 +376,44 @@ class TextOperation extends EditOperation {
for (let i = 0, l = ops.length; i < l; i++) { for (let i = 0, l = ops.length; i < l; i++) {
const op = ops[i] const op = ops[i]
if (op instanceof RetainOp) { if (op instanceof RetainOp) {
if (op.tracking) { // Where we need to end up after the retains
// Where we need to end up after the retains const target = strIndex + op.length
const target = strIndex + op.length // A previous retain could have overridden some tracking info. Now we
// A previous retain could have overridden some tracking info. Now we // need to restore it.
// need to restore it. const previousRanges = previousState.trackedChanges.inRange(
const previousChanges = previousState.trackedChanges.intersectRange( new Range(strIndex, op.length)
new Range(strIndex, op.length) )
)
for (const change of previousChanges) { let removeTrackingInfoIfNeeded
if (strIndex < change.range.start) { if (op.tracking) {
inverse.retain(change.range.start - strIndex, { removeTrackingInfoIfNeeded = new ClearTrackingProps()
tracking: new ClearTrackingProps(), }
})
strIndex = change.range.start for (const trackedChange of previousRanges) {
} if (strIndex < trackedChange.range.start) {
inverse.retain(change.range.length, { inverse.retain(trackedChange.range.start - strIndex, {
tracking: change.tracking, tracking: removeTrackingInfoIfNeeded,
}) })
strIndex += change.range.length strIndex = trackedChange.range.start
} }
if (strIndex < target) { if (trackedChange.range.end < strIndex + op.length) {
inverse.retain(target - strIndex, { inverse.retain(trackedChange.range.length, {
tracking: new ClearTrackingProps(), tracking: trackedChange.tracking,
}) })
strIndex = target strIndex = trackedChange.range.end
} }
} else { if (trackedChange.range.end !== strIndex) {
inverse.retain(op.length) // No need to split the range at the end
strIndex += op.length const [left] = trackedChange.range.splitAt(strIndex)
inverse.retain(left.length, { tracking: trackedChange.tracking })
strIndex = left.end
}
}
if (strIndex < target) {
inverse.retain(target - strIndex, {
tracking: removeTrackingInfoIfNeeded,
})
strIndex = target
} }
} else if (op instanceof InsertOp) { } else if (op instanceof InsertOp) {
inverse.remove(op.insertion.length) inverse.remove(op.insertion.length)
@ -86,32 +86,10 @@ class Range {
} }
/** /**
* Does this range overlap another range? * @param {Range} range
*
* Overlapping means that the two ranges have at least one character in common
*
* @param {Range} other - the other range
*/ */
overlaps(other) { overlaps(range) {
return this.start < other.end && this.end > other.start return this.start < range.end && this.end > range.start
}
/**
* Does this range overlap the start of another range?
*
* @param {Range} other - the other range
*/
overlapsStart(other) {
return this.start <= other.start && this.end > other.start
}
/**
* Does this range overlap the end of another range?
*
* @param {Range} other - the other range
*/
overlapsEnd(other) {
return this.start < other.end && this.end >= other.end
} }
/** /**
@ -249,26 +227,6 @@ class Range {
) )
return [rangeUpToCursor, rangeAfterCursor] return [rangeUpToCursor, rangeAfterCursor]
} }
/**
* Returns the intersection of this range with another range
*
* @param {Range} other - the other range
* @return {Range | null} the intersection or null if the intersection is empty
*/
intersect(other) {
if (this.contains(other)) {
return other
} else if (other.contains(this)) {
return this
} else if (other.overlapsStart(this)) {
return new Range(this.pos, other.end - this.start)
} else if (other.overlapsEnd(this)) {
return new Range(other.pos, this.end - other.start)
} else {
return null
}
}
} }
module.exports = Range module.exports = Range
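The three overlap predicates on `Range` above distinguish any overlap from overlap at a specific endpoint. For half-open `[start, end)` intervals they can be sketched as standalone functions (not the library class):

```javascript
// a and b are { start, end } half-open intervals.
const overlaps = (a, b) => a.start < b.end && a.end > b.start
// a covers b's first position:
const overlapsStart = (a, b) => a.start <= b.start && a.end > b.start
// a covers b's last position:
const overlapsEnd = (a, b) => a.start < b.end && a.end >= b.end

const a = { start: 0, end: 5 }
const b = { start: 3, end: 8 }
console.log(overlaps(a, b), overlapsStart(a, b), overlapsEnd(a, b))
// → true true false
```

`intersect` above leans on the asymmetric variants: when neither range contains the other, whichever of `overlapsStart`/`overlapsEnd` holds tells it which endpoints bound the intersection.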
@ -64,57 +64,17 @@ function cleanPart(filename) {
* @return {String} * @return {String}
*/ */
exports.clean = function (pathname) { exports.clean = function (pathname) {
return exports.cleanDebug(pathname)[0]
}
/**
* See clean
* @param {string} pathname
* @return {[string,string]}
*/
exports.cleanDebug = function (pathname) {
let prev = pathname
let reason = ''
/**
* @param {string} label
*/
function recordReasonIfChanged(label) {
if (pathname === prev) return
if (reason) reason += ','
reason += label
prev = pathname
}
pathname = path.normalize(pathname) pathname = path.normalize(pathname)
recordReasonIfChanged('normalize') pathname = pathname.replace(/\\/g, '/') // workaround for IE
pathname = pathname.replace(/\/+/g, '/') // no multiple slashes
pathname = pathname.replace(/\\/g, '/') pathname = pathname.replace(/^(\/.*)$/, '_$1') // no leading /
recordReasonIfChanged('workaround for IE') pathname = pathname.replace(/^(.+)\/$/, '$1') // no trailing /
pathname = pathname.replace(/^ *(.*)$/, '$1') // no leading spaces
pathname = pathname.replace(/\/+/g, '/') pathname = pathname.replace(/^(.*[^ ]) *$/, '$1') // no trailing spaces
recordReasonIfChanged('no multiple slashes')
pathname = pathname.replace(/^(\/.*)$/, '_$1')
recordReasonIfChanged('no leading /')
pathname = pathname.replace(/^(.+)\/$/, '$1')
recordReasonIfChanged('no trailing /')
pathname = pathname.replace(/^ *(.*)$/, '$1')
recordReasonIfChanged('no leading spaces')
pathname = pathname.replace(/^(.*[^ ]) *$/, '$1')
recordReasonIfChanged('no trailing spaces')
if (pathname.length === 0) pathname = '_' if (pathname.length === 0) pathname = '_'
recordReasonIfChanged('empty')
pathname = pathname.split('/').map(cleanPart).join('/') pathname = pathname.split('/').map(cleanPart).join('/')
recordReasonIfChanged('cleanPart')
pathname = pathname.replace(BLOCKED_FILE_RX, '@$1') pathname = pathname.replace(BLOCKED_FILE_RX, '@$1')
recordReasonIfChanged('BLOCKED_FILE_RX') return pathname
return [pathname, reason]
} }
/** /**
@ -124,19 +84,9 @@ exports.cleanDebug = function (pathname) {
* @return {Boolean} * @return {Boolean}
*/ */
exports.isClean = function pathnameIsClean(pathname) { exports.isClean = function pathnameIsClean(pathname) {
return exports.isCleanDebug(pathname)[0] return (
} exports.clean(pathname) === pathname &&
pathname.length <= MAX_PATH &&
/** pathname.length > 0
* A pathname is clean (see clean) and not too long. )
*
* @param {string} pathname
* @return {[boolean,string]}
*/
exports.isCleanDebug = function (pathname) {
if (pathname.length > MAX_PATH) return [false, 'MAX_PATH']
if (pathname.length === 0) return [false, 'empty']
const [cleanPathname, reason] = exports.cleanDebug(pathname)
if (cleanPathname !== pathname) return [false, reason]
return [true, '']
} }
@ -224,7 +224,7 @@ class Snapshot {
* *
* @param {string} kind see {File#load} * @param {string} kind see {File#load}
* @param {ReadonlyBlobStore} blobStore * @param {ReadonlyBlobStore} blobStore
* @return {Promise<Record<string, File>>} an object where keys are the pathnames and * @return {Promise<Object>} an object where keys are the pathnames and
* values are the files in the snapshot * values are the files in the snapshot
*/ */
async loadFiles(kind, blobStore) { async loadFiles(kind, blobStore) {
@ -132,7 +132,6 @@ export type RawScanOp = RawInsertOp | RawRemoveOp | RawRetainOp
export type RawTextOperation = { export type RawTextOperation = {
textOperation: RawScanOp[] textOperation: RawScanOp[]
contentHash?: string
} }
export type RawAddCommentOperation = { export type RawAddCommentOperation = {
@ -20,7 +20,7 @@
"@types/check-types": "^7.3.7", "@types/check-types": "^7.3.7",
"@types/path-browserify": "^1.0.2", "@types/path-browserify": "^1.0.2",
"chai": "^3.3.0", "chai": "^3.3.0",
"mocha": "^11.1.0", "mocha": "^10.2.0",
"sinon": "^9.2.4", "sinon": "^9.2.4",
"typescript": "^5.0.4" "typescript": "^5.0.4"
}, },
@ -193,13 +193,4 @@ describe('LazyStringFileData', function () {
expect(fileData.getStringLength()).to.equal(longString.length) expect(fileData.getStringLength()).to.equal(longString.length)
expect(fileData.getOperations()).to.have.length(1) expect(fileData.getOperations()).to.have.length(1)
}) })
it('truncates its operations after being stored', async function () {
const testHash = File.EMPTY_FILE_HASH
const fileData = new LazyStringFileData(testHash, undefined, 0)
fileData.edit(new TextOperation().insert('abc'))
const stored = await fileData.store(this.blobStore)
expect(fileData.hash).to.equal(stored.hash)
expect(fileData.operations).to.deep.equal([])
})
}) })
@ -1,3 +1,4 @@
// @ts-check
'use strict' 'use strict'
const { expect } = require('chai') const { expect } = require('chai')
@ -448,44 +449,4 @@ describe('Range', function () {
expect(() => range.insertAt(16, 3)).to.throw() expect(() => range.insertAt(16, 3)).to.throw()
}) })
}) })
describe('intersect', function () {
it('should handle partially overlapping ranges', function () {
const range1 = new Range(5, 10)
const range2 = new Range(3, 6)
const intersection1 = range1.intersect(range2)
expect(intersection1.pos).to.equal(5)
expect(intersection1.length).to.equal(4)
const intersection2 = range2.intersect(range1)
expect(intersection2.pos).to.equal(5)
expect(intersection2.length).to.equal(4)
})
it('should intersect with itself', function () {
const range = new Range(5, 10)
const intersection = range.intersect(range)
expect(intersection.pos).to.equal(5)
expect(intersection.length).to.equal(10)
})
it('should handle nested ranges', function () {
const range1 = new Range(5, 10)
const range2 = new Range(7, 2)
const intersection1 = range1.intersect(range2)
expect(intersection1.pos).to.equal(7)
expect(intersection1.length).to.equal(2)
const intersection2 = range2.intersect(range1)
expect(intersection2.pos).to.equal(7)
expect(intersection2.length).to.equal(2)
})
it('should handle disconnected ranges', function () {
const range1 = new Range(5, 10)
const range2 = new Range(20, 30)
const intersection1 = range1.intersect(range2)
expect(intersection1).to.be.null
const intersection2 = range2.intersect(range1)
expect(intersection2).to.be.null
})
})
}) })
@ -5,11 +5,10 @@ const ot = require('..')
const safePathname = ot.safePathname const safePathname = ot.safePathname
describe('safePathname', function () { describe('safePathname', function () {
function expectClean(input, output, reason = '') { function expectClean(input, output) {
// check expected output and also idempotency // check expected output and also idempotency
const [cleanedInput, gotReason] = safePathname.cleanDebug(input) const cleanedInput = safePathname.clean(input)
expect(cleanedInput).to.equal(output) expect(cleanedInput).to.equal(output)
expect(gotReason).to.equal(reason)
expect(safePathname.clean(cleanedInput)).to.equal(cleanedInput) expect(safePathname.clean(cleanedInput)).to.equal(cleanedInput)
expect(safePathname.isClean(cleanedInput)).to.be.true expect(safePathname.isClean(cleanedInput)).to.be.true
} }
@ -23,56 +22,44 @@ describe('safePathname', function () {
expect(safePathname.isClean('rm -rf /')).to.be.falsy expect(safePathname.isClean('rm -rf /')).to.be.falsy
// replace invalid characters with underscores // replace invalid characters with underscores
expectClean( expectClean('test-s*\u0001\u0002m\u0007st\u0008.jpg', 'test-s___m_st_.jpg')
'test-s*\u0001\u0002m\u0007st\u0008.jpg',
'test-s___m_st_.jpg',
'cleanPart'
)
// keep slashes, normalize paths, replace .. // keep slashes, normalize paths, replace ..
expectClean('./foo', 'foo', 'normalize') expectClean('./foo', 'foo')
expectClean('../foo', '__/foo', 'cleanPart') expectClean('../foo', '__/foo')
expectClean('foo/./bar', 'foo/bar', 'normalize') expectClean('foo/./bar', 'foo/bar')
expectClean('foo/../bar', 'bar', 'normalize') expectClean('foo/../bar', 'bar')
expectClean('../../tricky/foo.bar', '__/__/tricky/foo.bar', 'cleanPart') expectClean('../../tricky/foo.bar', '__/__/tricky/foo.bar')
expectClean( expectClean('foo/../../tricky/foo.bar', '__/tricky/foo.bar')
'foo/../../tricky/foo.bar', expectClean('foo/bar/../../tricky/foo.bar', 'tricky/foo.bar')
'__/tricky/foo.bar', expectClean('foo/bar/baz/../../tricky/foo.bar', 'foo/tricky/foo.bar')
'normalize,cleanPart'
)
expectClean('foo/bar/../../tricky/foo.bar', 'tricky/foo.bar', 'normalize')
expectClean(
'foo/bar/baz/../../tricky/foo.bar',
'foo/tricky/foo.bar',
'normalize'
)
// remove illegal chars even when there is no extension // remove illegal chars even when there is no extension
expectClean('**foo', '__foo', 'cleanPart') expectClean('**foo', '__foo')
// remove windows file paths // remove windows file paths
expectClean('c:\\temp\\foo.txt', 'c:/temp/foo.txt', 'workaround for IE') expectClean('c:\\temp\\foo.txt', 'c:/temp/foo.txt')
// do not allow a leading slash (relative paths only) // do not allow a leading slash (relative paths only)
expectClean('/foo', '_/foo', 'no leading /') expectClean('/foo', '_/foo')
expectClean('//foo', '_/foo', 'normalize,no leading /') expectClean('//foo', '_/foo')
// do not allow multiple leading slashes // do not allow multiple leading slashes
expectClean('//foo', '_/foo', 'normalize,no leading /') expectClean('//foo', '_/foo')
// do not allow a trailing slash // do not allow a trailing slash
expectClean('/', '_', 'no leading /,no trailing /') expectClean('/', '_')
expectClean('foo/', 'foo', 'no trailing /') expectClean('foo/', 'foo')
expectClean('foo.tex/', 'foo.tex', 'no trailing /') expectClean('foo.tex/', 'foo.tex')
// do not allow multiple trailing slashes // do not allow multiple trailing slashes
expectClean('//', '_', 'normalize,no leading /,no trailing /') expectClean('//', '_')
expectClean('///', '_', 'normalize,no leading /,no trailing /') expectClean('///', '_')
expectClean('foo//', 'foo', 'normalize,no trailing /') expectClean('foo//', 'foo')
// file and folder names that consist of . and .. are not OK // file and folder names that consist of . and .. are not OK
expectClean('.', '_', 'cleanPart') expectClean('.', '_')
expectClean('..', '__', 'cleanPart') expectClean('..', '__')
// we will allow name with more dots e.g. ... and .... // we will allow name with more dots e.g. ... and ....
expectClean('...', '...') expectClean('...', '...')
expectClean('....', '....') expectClean('....', '....')
@ -95,10 +82,10 @@ describe('safePathname', function () {
expectClean('a b.png', 'a b.png') expectClean('a b.png', 'a b.png')
// leading and trailing spaces are not OK // leading and trailing spaces are not OK
expectClean(' foo', 'foo', 'no leading spaces') expectClean(' foo', 'foo')
expectClean(' foo', 'foo', 'no leading spaces') expectClean(' foo', 'foo')
expectClean('foo ', 'foo', 'no trailing spaces') expectClean('foo ', 'foo')
expectClean('foo ', 'foo', 'no trailing spaces') expectClean('foo ', 'foo')
// reserved file names on Windows should not be OK, but we already have // reserved file names on Windows should not be OK, but we already have
// some in the old system, so have to allow them for now // some in the old system, so have to allow them for now
@ -113,14 +100,14 @@ describe('safePathname', function () {
// there's no particular reason to allow multiple slashes; sometimes people // there's no particular reason to allow multiple slashes; sometimes people
// seem to rename files to URLs (https://domain/path) in an attempt to // seem to rename files to URLs (https://domain/path) in an attempt to
// upload a file, and this results in an empty directory name // upload a file, and this results in an empty directory name
expectClean('foo//bar.png', 'foo/bar.png', 'normalize') expectClean('foo//bar.png', 'foo/bar.png')
expectClean('foo///bar.png', 'foo/bar.png', 'normalize') expectClean('foo///bar.png', 'foo/bar.png')
// Check javascript property handling // Check javascript property handling
expectClean('foo/prototype', 'foo/prototype') // OK as part of a pathname expectClean('foo/prototype', 'foo/prototype') // OK as part of a pathname
expectClean('prototype/test.txt', 'prototype/test.txt') expectClean('prototype/test.txt', 'prototype/test.txt')
expectClean('prototype', '@prototype', 'BLOCKED_FILE_RX') // not OK as whole pathname expectClean('prototype', '@prototype') // not OK as whole pathname
expectClean('hasOwnProperty', '@hasOwnProperty', 'BLOCKED_FILE_RX') expectClean('hasOwnProperty', '@hasOwnProperty')
expectClean('**proto**', '@__proto__', 'cleanPart,BLOCKED_FILE_RX') expectClean('**proto**', '@__proto__')
}) })
}) })
@ -107,7 +107,7 @@ describe('RetainOp', function () {
expect(op1.equals(new RetainOp(3))).to.be.true expect(op1.equals(new RetainOp(3))).to.be.true
}) })
it('cannot merge with another RetainOp if the tracking user is different', function () { it('cannot merge with another RetainOp if tracking info is different', function () {
const op1 = new RetainOp( const op1 = new RetainOp(
4, 4,
new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z')) new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
@ -120,14 +120,14 @@ describe('RetainOp', function () {
expect(() => op1.mergeWith(op2)).to.throw(Error) expect(() => op1.mergeWith(op2)).to.throw(Error)
}) })
it('can merge with another RetainOp if the tracking user is the same', function () { it('can merge with another RetainOp if tracking info is the same', function () {
const op1 = new RetainOp( const op1 = new RetainOp(
4, 4,
new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z')) new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
) )
const op2 = new RetainOp( const op2 = new RetainOp(
4, 4,
new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:01.000Z')) new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
) )
op1.mergeWith(op2) op1.mergeWith(op2)
expect( expect(
@ -310,7 +310,7 @@ describe('InsertOp', function () {
expect(() => op1.mergeWith(op2)).to.throw(Error) expect(() => op1.mergeWith(op2)).to.throw(Error)
}) })
it('cannot merge with another InsertOp if tracking user is different', function () { it('cannot merge with another InsertOp if tracking info is different', function () {
const op1 = new InsertOp( const op1 = new InsertOp(
'a', 'a',
new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z')) new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
@ -323,7 +323,7 @@ describe('InsertOp', function () {
expect(() => op1.mergeWith(op2)).to.throw(Error) expect(() => op1.mergeWith(op2)).to.throw(Error)
}) })
-  it('can merge with another InsertOp if tracking user and comment info is the same', function () {
+  it('can merge with another InsertOp if tracking and comment info is the same', function () {
     const op1 = new InsertOp(
       'a',
       new TrackingProps(
@@ -338,7 +338,7 @@ describe('InsertOp', function () {
       new TrackingProps(
         'insert',
         'user1',
-        new Date('2024-01-01T00:00:01.000Z')
+        new Date('2024-01-01T00:00:00.000Z')
       ),
       ['1', '2']
     )


@@ -322,47 +322,6 @@ describe('TextOperation', function () {
         new TextOperation().retain(4).remove(4).retain(3)
       )
     })
-    it('undoing a tracked delete restores the tracked changes', function () {
-      expectInverseToLeadToInitialState(
-        new StringFileData(
-          'the quick brown fox jumps over the lazy dog',
-          undefined,
-          [
-            {
-              range: { pos: 5, length: 5 },
-              tracking: {
-                ts: '2023-01-01T00:00:00.000Z',
-                type: 'insert',
-                userId: 'user1',
-              },
-            },
-            {
-              range: { pos: 12, length: 3 },
-              tracking: {
-                ts: '2023-01-01T00:00:00.000Z',
-                type: 'delete',
-                userId: 'user1',
-              },
-            },
-            {
-              range: { pos: 18, length: 5 },
-              tracking: {
-                ts: '2023-01-01T00:00:00.000Z',
-                type: 'insert',
-                userId: 'user1',
-              },
-            },
-          ]
-        ),
-        new TextOperation()
-          .retain(7)
-          .retain(13, {
-            tracking: new TrackingProps('delete', 'user1', new Date()),
-          })
-          .retain(23)
-      )
-    })
   })
   describe('compose', function () {


@@ -0,0 +1 @@
+node_modules/

libraries/promise-utils/.gitignore (new file)

@@ -0,0 +1,3 @@
+# managed by monorepo
+$ bin/update_build_scripts
+.npmrc


@@ -1 +1 @@
-22.17.0
+20.18.0


@@ -1,10 +1,10 @@
 promise-utils
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0


@@ -13,7 +13,6 @@ module.exports = {
   expressify,
   expressifyErrorHandler,
   promiseMapWithLimit,
-  promiseMapSettledWithLimit,
 }
 /**
@@ -265,19 +264,3 @@ async function promiseMapWithLimit(concurrency, array, fn) {
   const limit = pLimit(concurrency)
   return await Promise.all(array.map(x => limit(() => fn(x))))
 }
-
-/**
- * Map values in `array` with the async function `fn`
- *
- * Limit the number of unresolved promises to `concurrency`.
- *
- * @template T, U
- * @param {number} concurrency
- * @param {Array<T>} array
- * @param {(T) => Promise<U>} fn
- * @return {Promise<Array<PromiseSettledResult<U>>>}
- */
-function promiseMapSettledWithLimit(concurrency, array, fn) {
-  const limit = pLimit(concurrency)
-  return Promise.allSettled(array.map(x => limit(() => fn(x))))
-}


@@ -18,7 +18,7 @@
   "devDependencies": {
     "chai": "^4.3.10",
     "chai-as-promised": "^7.1.1",
-    "mocha": "^11.1.0",
+    "mocha": "^10.2.0",
     "typescript": "^5.0.4"
   },
   "dependencies": {


@@ -0,0 +1 @@
+node_modules/

libraries/ranges-tracker/.gitignore (new file)

@@ -0,0 +1,13 @@
+**.swp
+app.js
+app/js/
+test/unit/js/
+public/build/
+node_modules/
+/public/js/chat.js
+plato/
+.npmrc


@@ -1 +1 @@
-22.17.0
+20.18.0


@@ -1,10 +1,10 @@
 ranges-tracker
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0


@@ -145,7 +145,11 @@ class RangesTracker {
   }

   removeChangeId(changeId) {
-    this.removeChangeIds([changeId])
+    const change = this.getChange(changeId)
+    if (change == null) {
+      return
+    }
+    this._removeChange(change)
   }

   removeChangeIds(ids) {
@@ -312,7 +316,7 @@ class RangesTracker {
       const movedChanges = []
       const removeChanges = []
       const newChanges = []
-      const trackedDeletesAtOpPosition = []
       for (let i = 0; i < this.changes.length; i++) {
         change = this.changes[i]
         const changeStart = change.op.p
@@ -323,15 +327,13 @@ class RangesTracker {
           change.op.p += opLength
           movedChanges.push(change)
         } else if (opStart === changeStart) {
-          // If we are undoing, then we want to cancel any existing delete ranges if we can.
-          // Check if the insert matches the start of the delete, and just remove it from the delete instead if so.
           if (
-            !alreadyMerged &&
             undoing &&
             change.op.d.length >= op.i.length &&
             change.op.d.slice(0, op.i.length) === op.i
           ) {
+            // If we are undoing, then we want to reject any existing tracked delete if we can.
+            // Check if the insert matches the start of the delete, and just
+            // remove it from the delete instead if so.
             change.op.d = change.op.d.slice(op.i.length)
             change.op.p += op.i.length
             if (change.op.d === '') {
@@ -340,25 +342,9 @@ class RangesTracker {
               movedChanges.push(change)
             }
             alreadyMerged = true
-            // Any tracked delete that came before this tracked delete
-            // rejection was moved after the incoming insert. Move them back
-            // so that they appear before the tracked delete rejection.
-            for (const trackedDelete of trackedDeletesAtOpPosition) {
-              trackedDelete.op.p -= opLength
-            }
           } else {
-            // We're not rejecting that tracked delete. Move it after the
-            // insert.
             change.op.p += opLength
             movedChanges.push(change)
-            // Keep track of tracked deletes that are at the same position as the
-            // insert. If we find a tracked delete to reject, we'll want to
-            // reposition them.
-            if (!alreadyMerged) {
-              trackedDeletesAtOpPosition.push(change)
-            }
           }
         }
       } else if (change.op.i != null) {
@@ -638,11 +624,9 @@ class RangesTracker {
   }

   _addOp(op, metadata) {
-    // Don't take a reference to the existing op since we'll modify this in place with future changes
-    op = this._clone(op)
     const change = {
       id: this.newId(),
-      op,
+      op: this._clone(op), // Don't take a reference to the existing op since we'll modify this in place with future changes
       metadata: this._clone(metadata),
     }
     this.changes.push(change)
@@ -665,7 +649,7 @@ class RangesTracker {
   }

   _removeChange(change) {
-    this.changes = this.changes.filter(c => c !== change)
+    this.changes = this.changes.filter(c => c.id !== change.id)
     this._markAsDirty(change, 'change', 'removed')
   }
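The `_removeChange` difference in this hunk is easy to miss: filtering by object reference removes exactly the change instance passed in, while filtering by id removes every change sharing that id, which matters when ids are duplicated. A standalone illustration:

```javascript
// Two distinct change objects that (incorrectly) share the same id.
const changes = [
  { id: 'id1', op: { p: 10, i: 'one' } },
  { id: 'id1', op: { p: 20, i: 'two' } },
]
const target = changes[1]

// Filtering by id drops both entries, not just the target.
const byId = changes.filter(c => c.id !== target.id)

// Filtering by reference drops only the targeted entry.
const byRef = changes.filter(c => c !== target)

console.log(byId.length) // 0
console.log(byRef.map(c => c.op.i)) // ['one']
```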


@@ -20,7 +20,7 @@
   },
   "devDependencies": {
     "chai": "^4.3.6",
-    "mocha": "^11.1.0",
+    "mocha": "^10.2.0",
     "typescript": "^5.0.4"
   }
 }


@@ -4,7 +4,6 @@ const RangesTracker = require('../..')
 describe('RangesTracker', function () {
   describe('with duplicate change ids', function () {
     beforeEach(function () {
-      this.comments = []
       this.changes = [
         { id: 'id1', op: { p: 1, i: 'hello' } },
         { id: 'id2', op: { p: 10, i: 'world' } },
@@ -27,199 +26,4 @@ describe('RangesTracker', function () {
       expect(this.rangesTracker.changes).to.deep.equal([this.changes[2]])
     })
   })
-
-  describe('with duplicate tracked insert ids', function () {
-    beforeEach(function () {
-      this.comments = []
-      this.changes = [
-        { id: 'id1', op: { p: 10, i: 'one' } },
-        { id: 'id1', op: { p: 20, i: 'two' } },
-        { id: 'id1', op: { p: 30, d: 'three' } },
-      ]
-      this.rangesTracker = new RangesTracker(this.changes, this.comments)
-    })
-
-    it("deleting one tracked insert doesn't delete the others", function () {
-      this.rangesTracker.applyOp({ p: 20, d: 'two' })
-      expect(this.rangesTracker.changes).to.deep.equal([
-        this.changes[0],
-        this.changes[2],
-      ])
-    })
-  })
-
-  describe('with duplicate tracked delete ids', function () {
-    beforeEach(function () {
-      this.comments = []
-      this.changes = [
-        { id: 'id1', op: { p: 10, d: 'one' } },
-        { id: 'id1', op: { p: 20, d: 'two' } },
-        { id: 'id1', op: { p: 30, d: 'three' } },
-      ]
-      this.rangesTracker = new RangesTracker(this.changes, this.comments)
-    })
-
-    it('deleting over tracked deletes in tracked changes mode removes the tracked deletes covered', function () {
-      this.rangesTracker.track_changes = true
-      this.rangesTracker.applyOp({
-        p: 15,
-        d: '567890123456789012345',
-      })
-      expect(this.rangesTracker.changes.map(c => c.op)).to.deep.equal([
-        { p: 10, d: 'one' },
-        { p: 15, d: '56789two0123456789three012345' },
-      ])
-    })
-
-    it('a tracked delete between two tracked deletes joins them into a single tracked delete', function () {
-      this.rangesTracker.track_changes = true
-      this.rangesTracker.applyOp({
-        p: 20,
-        d: '0123456789',
-      })
-      expect(this.rangesTracker.changes.map(c => c.op)).to.deep.equal([
-        { p: 10, d: 'one' },
-        { p: 20, d: 'two0123456789three' },
-      ])
-    })
-
-    it("rejecting one tracked delete doesn't reject the others", function () {
-      this.rangesTracker.track_changes = true
-      this.rangesTracker.applyOp({
-        p: 20,
-        i: 'two',
-        u: true,
-      })
-      expect(this.rangesTracker.changes.map(c => c.op)).to.deep.equal([
-        { p: 10, d: 'one' },
-        { p: 33, d: 'three' },
-      ])
-    })
-
-    it("rejecting all tracked deletes doesn't introduce tracked inserts", function () {
-      this.rangesTracker.track_changes = true
-      this.rangesTracker.applyOp({
-        p: 10,
-        i: 'one',
-        u: true,
-      })
-      this.rangesTracker.applyOp({
-        p: 23,
-        i: 'two',
-        u: true,
-      })
-      this.rangesTracker.applyOp({
-        p: 36,
-        i: 'three',
-        u: true,
-      })
-      expect(this.rangesTracker.changes.map(c => c.op)).to.deep.equal([])
-    })
-  })
-
-  describe('with multiple tracked deletes at the same position', function () {
-    beforeEach(function () {
-      this.comments = []
-      this.changes = [
-        { id: 'id1', op: { p: 33, d: 'before' } },
-        { id: 'id2', op: { p: 50, d: 'right before' } },
-        { id: 'id3', op: { p: 50, d: 'this one' } },
-        { id: 'id4', op: { p: 50, d: 'right after' } },
-        { id: 'id5', op: { p: 75, d: 'long after' } },
-      ]
-      this.rangesTracker = new RangesTracker(this.changes, this.comments)
-    })
-
-    it('preserves the text order when rejecting changes', function () {
-      this.rangesTracker.applyOp(
-        { p: 50, i: 'this one', u: true },
-        { user_id: 'user-id' }
-      )
-      expect(this.rangesTracker.changes).to.deep.equal([
-        { id: 'id1', op: { p: 33, d: 'before' } },
-        { id: 'id2', op: { p: 50, d: 'right before' } },
-        { id: 'id4', op: { p: 58, d: 'right after' } },
-        { id: 'id5', op: { p: 83, d: 'long after' } },
-      ])
-    })
-
-    it('moves all tracked deletes after the insert if not rejecting changes', function () {
-      this.rangesTracker.applyOp(
-        { p: 50, i: 'some other text', u: true, orderedRejections: true },
-        { user_id: 'user-id' }
-      )
-      expect(this.rangesTracker.changes).to.deep.equal([
-        { id: 'id1', op: { p: 33, d: 'before' } },
-        { id: 'id2', op: { p: 65, d: 'right before' } },
-        { id: 'id3', op: { p: 65, d: 'this one' } },
-        { id: 'id4', op: { p: 65, d: 'right after' } },
-        { id: 'id5', op: { p: 90, d: 'long after' } },
-      ])
-    })
-  })
-
-  describe('with multiple tracked deletes at the same position with the same content', function () {
-    beforeEach(function () {
-      this.comments = []
-      this.changes = [
-        { id: 'id1', op: { p: 10, d: 'cat' } },
-        { id: 'id2', op: { p: 10, d: 'giraffe' } },
-        { id: 'id3', op: { p: 10, d: 'cat' } },
-        { id: 'id4', op: { p: 10, d: 'giraffe' } },
-      ]
-      this.rangesTracker = new RangesTracker(this.changes, this.comments)
-    })
-
-    it('removes only the first matching tracked delete', function () {
-      this.rangesTracker.applyOp(
-        { p: 10, i: 'giraffe', u: true },
-        { user_id: 'user-id' }
-      )
-      expect(this.rangesTracker.changes).to.deep.equal([
-        { id: 'id1', op: { p: 10, d: 'cat' } },
-        { id: 'id3', op: { p: 17, d: 'cat' } },
-        { id: 'id4', op: { p: 17, d: 'giraffe' } },
-      ])
-    })
-  })
-
-  describe('with a tracked insert at the same position as a tracked delete', function () {
-    beforeEach(function () {
-      this.comments = []
-      this.changes = [
-        {
-          id: 'id1',
-          op: { p: 5, d: 'before' },
-          metadata: { user_id: 'user-id' },
-        },
-        {
-          id: 'id2',
-          op: { p: 10, d: 'delete' },
-          metadata: { user_id: 'user-id' },
-        },
-        {
-          id: 'id3',
-          op: { p: 10, i: 'insert' },
-          metadata: { user_id: 'user-id' },
-        },
-      ]
-      this.rangesTracker = new RangesTracker(this.changes, this.comments)
-    })
-
-    it('places a tracked insert at the same position before both the delete and the insert', function () {
-      this.rangesTracker.track_changes = true
-      this.rangesTracker.applyOp(
-        { p: 10, i: 'incoming' },
-        { user_id: 'user-id' }
-      )
-      expect(this.rangesTracker.changes.map(change => change.op)).to.deep.equal(
-        [
-          { p: 5, d: 'before' },
-          { p: 10, i: 'incoming' },
-          { p: 18, d: 'delete' },
-          { p: 18, i: 'insert' },
-        ]
-      )
-    })
-  })
 })


@@ -0,0 +1 @@
+node_modules/

libraries/redis-wrapper/.gitignore (new file)

@@ -0,0 +1,13 @@
+**.swp
+app.js
+app/js/
+test/unit/js/
+public/build/
+node_modules/
+/public/js/chat.js
+plato/
+.npmrc

Some files were not shown because too many files have changed in this diff.