Mirror of https://github.com/yu-i-i/overleaf-cep.git, synced 2025-07-28 20:00:10 +02:00

Compare commits

No commits in common. "ext-ce" and "v5.4.1" have entirely different histories.
2276 changed files with 56978 additions and 103712 deletions
@@ -1,19 +1,10 @@
----
-name: Bug report
-about: Report a bug
-title: ''
-labels: type:bug
-assignees: ''
-
----
-
 <!--
 
 Note: If you are using www.overleaf.com and have a problem,
 or if you would like to request a new feature please contact
 the support team at support@overleaf.com
 
 This form should only be used to report bugs in the
 Community Edition release of Overleaf.
 
 -->
README.md (61 changed lines)
@@ -14,52 +14,39 @@
   <a href="#license">License</a>
 </p>
 
-<img src="doc/screenshot.png" alt="A screenshot of a project being edited in Overleaf Extended Community Edition">
+<img src="doc/screenshot.png" alt="A screenshot of a project being edited in Overleaf Community Edition">
 <p align="center">
-  Figure 1: A screenshot of a project being edited in Overleaf Extended Community Edition.
+  Figure 1: A screenshot of a project being edited in Overleaf Community Edition.
 </p>
 
 ## Community Edition
 
-[Overleaf](https://www.overleaf.com) is an open-source online real-time collaborative LaTeX editor. Overleaf runs a hosted version at [www.overleaf.com](https://www.overleaf.com), but you can also run your own local version, and contribute to the development of Overleaf.
+[Overleaf](https://www.overleaf.com) is an open-source online real-time collaborative LaTeX editor. We run a hosted version at [www.overleaf.com](https://www.overleaf.com), but you can also run your own local version, and contribute to the development of Overleaf.
 
-## Extended Community Edition
-
-The present "extended" version of Overleaf CE includes:
-
-- Template Gallery
-- Sandboxed Compiles with TeX Live image selection
-- LDAP authentication
-- SAML authentication
-- OpenID Connect authentication
-- Real-time track changes and comments
-- Autocomplete of reference keys
-- Symbol Palette
-- "From External URL" feature
-
-> [!CAUTION]
-> Overleaf Community Edition is intended for use in environments where **all** users are trusted. Community Edition is **not** appropriate for scenarios where isolation of users is required due to Sandbox Compiles not being available. When not using Sandboxed Compiles, users have full read and write access to the `sharelatex` container resources (filesystem, network, environment variables) when running LaTeX compiles.
-Therefore, in any environment where not all users can be fully trusted, it is strongly recommended to enable the Sandboxed Compiles feature available in the Extended Community Edition.
-
-For more information on Sandbox Compiles check out Overleaf [documentation](https://docs.overleaf.com/on-premises/configuration/overleaf-toolkit/server-pro-only-configuration/sandboxed-compiles).
-
 ## Enterprise
 
-If you want help installing and maintaining Overleaf in your lab or workplace, Overleaf offers an officially supported version called [Overleaf Server Pro](https://www.overleaf.com/for/enterprises).
+If you want help installing and maintaining Overleaf in your lab or workplace, we offer an officially supported version called [Overleaf Server Pro](https://www.overleaf.com/for/enterprises). It also includes more features for security (SSO with LDAP or SAML), administration and collaboration (e.g. tracked changes). [Find out more!](https://www.overleaf.com/for/enterprises)
 
+## Keeping up to date
+
+Sign up to the [mailing list](https://mailchi.mp/overleaf.com/community-edition-and-server-pro) to get updates on Overleaf releases and development.
+
 ## Installation
 
-Detailed installation instructions can be found in the [Overleaf Toolkit](https://github.com/overleaf/toolkit/).
-Configuration details and release history for the Extended Community Edition can be found on the [Extended CE Wiki Page](https://github.com/yu-i-i/overleaf-cep/wiki).
+We have detailed installation instructions in the [Overleaf Toolkit](https://github.com/overleaf/toolkit/).
+
+## Upgrading
+
+If you are upgrading from a previous version of Overleaf, please see the [Release Notes section on the Wiki](https://github.com/overleaf/overleaf/wiki#release-notes) for all of the versions between your current version and the version you are upgrading to.
 
 ## Overleaf Docker Image
 
 This repo contains two dockerfiles, [`Dockerfile-base`](server-ce/Dockerfile-base), which builds the
-`sharelatex/sharelatex-base:ext-ce` image, and [`Dockerfile`](server-ce/Dockerfile) which builds the
-`sharelatex/sharelatex:ext-ce` image.
+`sharelatex/sharelatex-base` image, and [`Dockerfile`](server-ce/Dockerfile) which builds the
+`sharelatex/sharelatex` (or "community") image.
 
 The Base image generally contains the basic dependencies like `wget`, plus `texlive`.
-This is split out because it's a pretty heavy set of
+We split this out because it's a pretty heavy set of
 dependencies, and it's nice to not have to rebuild all of that every time.
 
 The `sharelatex/sharelatex` image extends the base image and adds the actual Overleaf code
@@ -67,16 +54,20 @@ and services.
 
 Use `make build-base` and `make build-community` from `server-ce/` to build these images.
 
-The [Phusion base-image](https://github.com/phusion/baseimage-docker)
-(which is extended by the `base` image) provides a VM-like container
+We use the [Phusion base-image](https://github.com/phusion/baseimage-docker)
+(which is extended by our `base` image) to provide us with a VM-like container
 in which to run the Overleaf services. Baseimage uses the `runit` service
-manager to manage services, and init scripts from the `server-ce/runit`
-folder are added.
+manager to manage services, and we add our init-scripts from the `server-ce/runit`
+folder.
 
+## Contributing
+
+Please see the [CONTRIBUTING](CONTRIBUTING.md) file for information on contributing to the development of Overleaf.
 
 ## Authors
 
 [The Overleaf Team](https://www.overleaf.com/about)
-[yu-i-i](https://github.com/yu-i-i/overleaf-cep) — Extensions for CE unless otherwise noted
 
 ## License
 
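A quick way to exercise the two-image split both READMEs describe. This is a sketch under the stated assumption that you run it from a checkout of this repo; the resulting image tags differ between the two sides, as the diff above shows:

```
# Build the heavy base image first (wget, texlive, etc.)
cd server-ce
make build-base

# Then build the much lighter application image on top of it
make build-community
```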
@@ -42,7 +42,7 @@ To do this, use the included `bin/dev` script:
 bin/dev
 ```
 
-This will start all services using `node --watch`, which will automatically monitor the code and restart the services as necessary.
+This will start all services using `nodemon`, which will automatically monitor the code and restart the services as necessary.
 
 To improve performance, you can start only a subset of the services in development mode by providing a space-separated list to the `bin/dev` script:
 
@@ -77,7 +77,6 @@ each service:
 | `filestore`       | 9235 |
 | `notifications`   | 9236 |
 | `real-time`       | 9237 |
-| `references`      | 9238 |
 | `history-v1`      | 9239 |
 | `project-history` | 9240 |
 
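A concrete form of the subset invocation mentioned above. The service names are illustrative assumptions; any of the services in the port table should qualify:

```
# start only the web and real-time services in watch mode
bin/dev web real-time
```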
@@ -6,18 +6,14 @@ DOCUMENT_UPDATER_HOST=document-updater
 FILESTORE_HOST=filestore
 GRACEFUL_SHUTDOWN_DELAY_SECONDS=0
 HISTORY_V1_HOST=history-v1
-HISTORY_REDIS_HOST=redis
 LISTEN_ADDRESS=0.0.0.0
 MONGO_HOST=mongo
 MONGO_URL=mongodb://mongo/sharelatex?directConnection=true
 NOTIFICATIONS_HOST=notifications
 PROJECT_HISTORY_HOST=project-history
-QUEUES_REDIS_HOST=redis
 REALTIME_HOST=real-time
 REDIS_HOST=redis
-REFERENCES_HOST=references
 SESSION_SECRET=foo
-V1_HISTORY_HOST=history-v1
 WEBPACK_HOST=webpack
 WEB_API_PASSWORD=overleaf
 WEB_API_USER=overleaf
@@ -112,19 +112,8 @@ services:
       - ../services/real-time/app.js:/overleaf/services/real-time/app.js
       - ../services/real-time/config:/overleaf/services/real-time/config
 
-  references:
-    command: ["node", "--watch", "app.js"]
-    environment:
-      - NODE_OPTIONS=--inspect=0.0.0.0:9229
-    ports:
-      - "127.0.0.1:9238:9229"
-    volumes:
-      - ../services/references/app:/overleaf/services/references/app
-      - ../services/references/config:/overleaf/services/references/config
-      - ../services/references/app.js:/overleaf/services/references/app.js
-
   web:
-    command: ["node", "--watch", "app.mjs", "--watch-locales"]
+    command: ["node", "--watch", "app.js", "--watch-locales"]
     environment:
       - NODE_OPTIONS=--inspect=0.0.0.0:9229
     ports:
@@ -1,5 +1,6 @@
 volumes:
   clsi-cache:
+  clsi-output:
   filestore-public-files:
   filestore-template-files:
   filestore-uploads:
@@ -25,16 +26,15 @@ services:
     env_file:
       - dev.env
     environment:
+      - DOCKER_RUNNER=true
       - TEXLIVE_IMAGE=texlive-full # docker build texlive -t texlive-full
-      - SANDBOXED_COMPILES=true
-      - SANDBOXED_COMPILES_HOST_DIR_COMPILES=${PWD}/compiles
-      - SANDBOXED_COMPILES_HOST_DIR_OUTPUT=${PWD}/output
+      - COMPILES_HOST_DIR=${PWD}/compiles
     user: root
     volumes:
       - ${PWD}/compiles:/overleaf/services/clsi/compiles
-      - ${PWD}/output:/overleaf/services/clsi/output
       - ${DOCKER_SOCKET_PATH:-/var/run/docker.sock}:/var/run/docker.sock
       - clsi-cache:/overleaf/services/clsi/cache
+      - clsi-output:/overleaf/services/clsi/output
 
   contacts:
     build:
@@ -123,7 +123,7 @@ services:
       dockerfile: services/real-time/Dockerfile
     env_file:
       - dev.env
 
   redis:
     image: redis:5
     ports:
@@ -131,13 +131,6 @@ services:
     volumes:
       - redis-data:/data
 
-  references:
-    build:
-      context: ..
-      dockerfile: services/references/Dockerfile
-    env_file:
-      - dev.env
-
   web:
     build:
       context: ..
@@ -147,7 +140,7 @@ services:
       - dev.env
     environment:
       - APP_NAME=Overleaf Community Edition
-      - ENABLED_LINKED_FILE_TYPES=project_file,project_output_file,url
+      - ENABLED_LINKED_FILE_TYPES=project_file,project_output_file
       - EMAIL_CONFIRMATION_DISABLED=true
      - NODE_ENV=development
       - OVERLEAF_ALLOW_PUBLIC_ACCESS=true
@@ -168,7 +161,6 @@ services:
       - notifications
      - project-history
       - real-time
-      - references
 
   webpack:
     build:
BIN doc/logo.png: binary file not shown (before: 13 KiB, after: 71 KiB)
BIN (second image, file name not shown): binary file not shown (before: 1 MiB, after: 587 KiB)
@@ -32,7 +32,7 @@ services:
         OVERLEAF_REDIS_HOST: redis
         REDIS_HOST: redis
 
-        ENABLED_LINKED_FILE_TYPES: 'project_file,project_output_file,url'
+        ENABLED_LINKED_FILE_TYPES: 'project_file,project_output_file'
 
         # Enables Thumbnail generation using ImageMagick
         ENABLE_CONVERSIONS: 'true'
@@ -73,19 +73,11 @@ services:
         ## Server Pro ##
         ################
 
-        ## The Community Edition is intended for use in environments where all users are trusted and is not appropriate for
-        ## scenarios where isolation of users is required. Sandboxed Compiles are not available in the Community Edition,
-        ## so the following environment variables must be commented out to avoid compile issues.
-        ##
-        ## Sandboxed Compiles: https://docs.overleaf.com/on-premises/configuration/overleaf-toolkit/server-pro-only-configuration/sandboxed-compiles
+        ## Sandboxed Compiles: https://github.com/overleaf/overleaf/wiki/Server-Pro:-Sandboxed-Compiles
         SANDBOXED_COMPILES: 'true'
-        ### Bind-mount source for /var/lib/overleaf/data/compiles inside the container.
-        SANDBOXED_COMPILES_HOST_DIR_COMPILES: '/home/user/sharelatex_data/data/compiles'
-        ### Bind-mount source for /var/lib/overleaf/data/output inside the container.
-        SANDBOXED_COMPILES_HOST_DIR_OUTPUT: '/home/user/sharelatex_data/data/output'
-        ### Backwards compatibility (before Server Pro 5.5)
-        DOCKER_RUNNER: 'true'
         SANDBOXED_COMPILES_SIBLING_CONTAINERS: 'true'
+        ### Bind-mount source for /var/lib/overleaf/data/compiles inside the container.
+        SANDBOXED_COMPILES_HOST_DIR: '/home/user/sharelatex_data/data/compiles'
 
         ## Works with test LDAP server shown at bottom of docker compose
         # OVERLEAF_LDAP_URL: 'ldap://ldap:389'
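The hunk above also documents a variable rename around Server Pro 5.5: `SANDBOXED_COMPILES_HOST_DIR` (together with `DOCKER_RUNNER`) is the older spelling, while the ext-ce side uses the split `SANDBOXED_COMPILES_HOST_DIR_COMPILES` / `SANDBOXED_COMPILES_HOST_DIR_OUTPUT` pair. A minimal sketch of the newer form, assuming the same Toolkit-style data directory used in the sample:

```
environment:
  SANDBOXED_COMPILES: 'true'
  # Bind-mount sources on the host; inside the container these back
  # /var/lib/overleaf/data/compiles and /var/lib/overleaf/data/output
  SANDBOXED_COMPILES_HOST_DIR_COMPILES: '/home/user/sharelatex_data/data/compiles'
  SANDBOXED_COMPILES_HOST_DIR_OUTPUT: '/home/user/sharelatex_data/data/output'
```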
libraries/access-token-encryptor/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/access-token-encryptor/.gitignore (vendored, new file, 46 lines)
@@ -0,0 +1,46 @@
+compileFolder
+
+Compiled source #
+###################
+*.com
+*.class
+*.dll
+*.exe
+*.o
+*.so
+
+# Packages #
+############
+# it's better to unpack these files and commit the raw source
+# git has its own built in compression methods
+*.7z
+*.dmg
+*.gz
+*.iso
+*.jar
+*.rar
+*.tar
+*.zip
+
+# Logs and databases #
+######################
+*.log
+*.sql
+*.sqlite
+
+# OS generated files #
+######################
+.DS_Store?
+ehthumbs.db
+Icon?
+Thumbs.db
+
+/node_modules/*
+data/*/*
+
+**.swp
+
+/log.json
+hash_folder
+
+.npmrc

(file name not shown)
@@ -1 +1 @@
-22.17.0
+20.18.2

(file name not shown)
@@ -1,10 +1,10 @@
 access-token-encryptor
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.2
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
libraries/fetch-utils/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/fetch-utils/.gitignore (vendored, new file, 3 lines)
@@ -0,0 +1,3 @@
+
+# managed by monorepo$ bin/update_build_scripts
+.npmrc

(file name not shown)
@@ -1 +1 @@
-22.17.0
+20.18.2

(file name not shown)
@@ -1,10 +1,10 @@
 fetch-utils
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.2
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
|
@ -23,11 +23,11 @@ async function fetchJson(url, opts = {}) {
|
||||||
}
|
}
|
||||||
|
|
||||||
async function fetchJsonWithResponse(url, opts = {}) {
|
async function fetchJsonWithResponse(url, opts = {}) {
|
||||||
const { fetchOpts, detachSignal } = parseOpts(opts)
|
const { fetchOpts } = parseOpts(opts)
|
||||||
fetchOpts.headers = fetchOpts.headers ?? {}
|
fetchOpts.headers = fetchOpts.headers ?? {}
|
||||||
fetchOpts.headers.Accept = fetchOpts.headers.Accept ?? 'application/json'
|
fetchOpts.headers.Accept = fetchOpts.headers.Accept ?? 'application/json'
|
||||||
|
|
||||||
const response = await performRequest(url, fetchOpts, detachSignal)
|
const response = await performRequest(url, fetchOpts)
|
||||||
if (!response.ok) {
|
if (!response.ok) {
|
||||||
const body = await maybeGetResponseBody(response)
|
const body = await maybeGetResponseBody(response)
|
||||||
throw new RequestFailedError(url, opts, response, body)
|
throw new RequestFailedError(url, opts, response, body)
|
||||||
|
@ -53,8 +53,8 @@ async function fetchStream(url, opts = {}) {
|
||||||
}
|
}
|
||||||
|
|
||||||
async function fetchStreamWithResponse(url, opts = {}) {
|
async function fetchStreamWithResponse(url, opts = {}) {
|
||||||
const { fetchOpts, abortController, detachSignal } = parseOpts(opts)
|
const { fetchOpts, abortController } = parseOpts(opts)
|
||||||
const response = await performRequest(url, fetchOpts, detachSignal)
|
const response = await performRequest(url, fetchOpts)
|
||||||
|
|
||||||
if (!response.ok) {
|
if (!response.ok) {
|
||||||
const body = await maybeGetResponseBody(response)
|
const body = await maybeGetResponseBody(response)
|
||||||
|
@ -76,8 +76,8 @@ async function fetchStreamWithResponse(url, opts = {}) {
|
||||||
* @throws {RequestFailedError} if the response has a failure status code
|
* @throws {RequestFailedError} if the response has a failure status code
|
||||||
*/
|
*/
|
||||||
async function fetchNothing(url, opts = {}) {
|
async function fetchNothing(url, opts = {}) {
|
||||||
const { fetchOpts, detachSignal } = parseOpts(opts)
|
const { fetchOpts } = parseOpts(opts)
|
||||||
const response = await performRequest(url, fetchOpts, detachSignal)
|
const response = await performRequest(url, fetchOpts)
|
||||||
if (!response.ok) {
|
if (!response.ok) {
|
||||||
const body = await maybeGetResponseBody(response)
|
const body = await maybeGetResponseBody(response)
|
||||||
throw new RequestFailedError(url, opts, response, body)
|
throw new RequestFailedError(url, opts, response, body)
|
||||||
|
@ -108,9 +108,9 @@ async function fetchRedirect(url, opts = {}) {
|
||||||
* @throws {RequestFailedError} if the response has a non redirect status code or missing Location header
|
* @throws {RequestFailedError} if the response has a non redirect status code or missing Location header
|
||||||
*/
|
*/
|
||||||
async function fetchRedirectWithResponse(url, opts = {}) {
|
async function fetchRedirectWithResponse(url, opts = {}) {
|
||||||
const { fetchOpts, detachSignal } = parseOpts(opts)
|
const { fetchOpts } = parseOpts(opts)
|
||||||
fetchOpts.redirect = 'manual'
|
fetchOpts.redirect = 'manual'
|
||||||
const response = await performRequest(url, fetchOpts, detachSignal)
|
const response = await performRequest(url, fetchOpts)
|
||||||
if (response.status < 300 || response.status >= 400) {
|
if (response.status < 300 || response.status >= 400) {
|
||||||
const body = await maybeGetResponseBody(response)
|
const body = await maybeGetResponseBody(response)
|
||||||
throw new RequestFailedError(url, opts, response, body)
|
throw new RequestFailedError(url, opts, response, body)
|
||||||
|
@ -142,8 +142,8 @@ async function fetchString(url, opts = {}) {
|
||||||
}
|
}
|
||||||
|
|
||||||
async function fetchStringWithResponse(url, opts = {}) {
|
async function fetchStringWithResponse(url, opts = {}) {
|
||||||
const { fetchOpts, detachSignal } = parseOpts(opts)
|
const { fetchOpts } = parseOpts(opts)
|
||||||
const response = await performRequest(url, fetchOpts, detachSignal)
|
const response = await performRequest(url, fetchOpts)
|
||||||
if (!response.ok) {
|
if (!response.ok) {
|
||||||
const body = await maybeGetResponseBody(response)
|
const body = await maybeGetResponseBody(response)
|
||||||
throw new RequestFailedError(url, opts, response, body)
|
throw new RequestFailedError(url, opts, response, body)
|
||||||
|
@ -178,14 +178,13 @@ function parseOpts(opts) {
|
||||||
|
|
||||||
const abortController = new AbortController()
|
const abortController = new AbortController()
|
||||||
fetchOpts.signal = abortController.signal
|
fetchOpts.signal = abortController.signal
|
||||||
let detachSignal = () => {}
|
|
||||||
if (opts.signal) {
|
if (opts.signal) {
|
||||||
detachSignal = abortOnSignal(abortController, opts.signal)
|
abortOnSignal(abortController, opts.signal)
|
||||||
}
|
}
|
||||||
if (opts.body instanceof Readable) {
|
if (opts.body instanceof Readable) {
|
||||||
abortOnDestroyedRequest(abortController, fetchOpts.body)
|
abortOnDestroyedRequest(abortController, fetchOpts.body)
|
||||||
}
|
}
|
||||||
return { fetchOpts, abortController, detachSignal }
|
return { fetchOpts, abortController }
|
||||||
}
|
}
|
||||||
|
|
||||||
function setupJsonBody(fetchOpts, json) {
|
function setupJsonBody(fetchOpts, json) {
|
||||||
|
@ -209,9 +208,6 @@ function abortOnSignal(abortController, signal) {
|
||||||
abortController.abort(signal.reason)
|
abortController.abort(signal.reason)
|
||||||
}
|
}
|
||||||
signal.addEventListener('abort', listener)
|
signal.addEventListener('abort', listener)
|
||||||
return () => {
|
|
||||||
signal.removeEventListener('abort', listener)
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
function abortOnDestroyedRequest(abortController, stream) {
|
function abortOnDestroyedRequest(abortController, stream) {
|
||||||
|
@ -230,12 +226,11 @@ function abortOnDestroyedResponse(abortController, response) {
|
||||||
})
|
})
|
||||||
}
|
}
|
||||||
|
|
||||||
async function performRequest(url, fetchOpts, detachSignal) {
|
async function performRequest(url, fetchOpts) {
|
||||||
let response
|
let response
|
||||||
try {
|
try {
|
||||||
response = await fetch(url, fetchOpts)
|
response = await fetch(url, fetchOpts)
|
||||||
} catch (err) {
|
} catch (err) {
|
||||||
detachSignal()
|
|
||||||
if (fetchOpts.body instanceof Readable) {
|
if (fetchOpts.body instanceof Readable) {
|
||||||
fetchOpts.body.destroy()
|
fetchOpts.body.destroy()
|
||||||
}
|
}
|
||||||
|
@ -244,7 +239,6 @@ async function performRequest(url, fetchOpts, detachSignal) {
|
||||||
method: fetchOpts.method ?? 'GET',
|
method: fetchOpts.method ?? 'GET',
|
||||||
})
|
})
|
||||||
}
|
}
|
||||||
response.body.on('close', detachSignal)
|
|
||||||
if (fetchOpts.body instanceof Readable) {
|
if (fetchOpts.body instanceof Readable) {
|
||||||
response.body.on('close', () => {
|
response.body.on('close', () => {
|
||||||
if (!fetchOpts.body.readableEnded) {
|
if (!fetchOpts.body.readableEnded) {
|
||||||
|
|
|
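For context on the `detachSignal` lines removed above: the ext-ce side (tracking newer upstream code) detaches its `abort` listener from the caller's `AbortSignal` once a request settles, so one long-lived signal reused across many requests does not accumulate listeners. A minimal sketch of the pattern, independent of this library's exact code:

```
// Sketch of the listener-detach pattern, assuming a caller-supplied signal.
function forwardAbort(abortController, signal) {
  const listener = () => abortController.abort(signal.reason)
  signal.addEventListener('abort', listener)
  // Return a detach function; the caller invokes it when the request
  // settles, so a long-lived signal never piles up stale listeners.
  return () => signal.removeEventListener('abort', listener)
}
```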
@@ -1,9 +1,6 @@
 const { expect } = require('chai')
-const fs = require('node:fs')
-const events = require('node:events')
 const { FetchError, AbortError } = require('node-fetch')
 const { Readable } = require('node:stream')
-const { pipeline } = require('node:stream/promises')
 const { once } = require('node:events')
 const { TestServer } = require('./helpers/TestServer')
 const selfsigned = require('selfsigned')
@@ -206,31 +203,6 @@ describe('fetch-utils', function () {
     ).to.be.rejectedWith(AbortError)
     expect(stream.destroyed).to.be.true
   })
-
-  it('detaches from signal on success', async function () {
-    const signal = AbortSignal.timeout(10_000)
-    for (let i = 0; i < 20; i++) {
-      const s = await fetchStream(this.url('/hello'), { signal })
-      expect(events.getEventListeners(signal, 'abort')).to.have.length(1)
-      await pipeline(s, fs.createWriteStream('/dev/null'))
-      expect(events.getEventListeners(signal, 'abort')).to.have.length(0)
-    }
-  })
-
-  it('detaches from signal on error', async function () {
-    const signal = AbortSignal.timeout(10_000)
-    for (let i = 0; i < 20; i++) {
-      try {
-        await fetchStream(this.url('/500'), { signal })
-      } catch (err) {
-        if (err instanceof RequestFailedError && err.response.status === 500)
-          continue
-        throw err
-      } finally {
-        expect(events.getEventListeners(signal, 'abort')).to.have.length(0)
-      }
-    }
-  })
 })
 
 describe('fetchNothing', function () {
@@ -419,16 +391,9 @@ async function* infiniteIterator() {
 async function abortOnceReceived(func, server) {
   const controller = new AbortController()
   const promise = func(controller.signal)
-  expect(events.getEventListeners(controller.signal, 'abort')).to.have.length(1)
   await once(server.events, 'request-received')
   controller.abort()
-  try {
-    return await promise
-  } finally {
-    expect(events.getEventListeners(controller.signal, 'abort')).to.have.length(
-      0
-    )
-  }
+  return await promise
 }
 
 async function expectRequestAborted(req) {
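The deleted tests above lean on `events.getEventListeners` (available since Node 15, and usable on an `AbortSignal` as an `EventTarget`) to assert that no `abort` listeners leak. The same check works standalone:

```
const events = require('node:events')

// Count 'abort' listeners currently attached to a signal; a steady
// count across repeated requests indicates listeners are detached.
const controller = new AbortController()
console.log(events.getEventListeners(controller.signal, 'abort').length) // 0
controller.signal.addEventListener('abort', () => {})
console.log(events.getEventListeners(controller.signal, 'abort').length) // 1
```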
libraries/logger/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/logger/.gitignore (vendored, new file, 3 lines)
@@ -0,0 +1,3 @@
+node_modules
+
+.npmrc

(file name not shown)
@@ -1 +1 @@
-22.17.0
+20.18.2

(file name not shown)
@@ -1,10 +1,10 @@
 logger
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.2
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0

libraries/metrics/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/metrics/.gitignore (vendored, new file, 3 lines)
@@ -0,0 +1,3 @@
+node_modules
+
+.npmrc

(file name not shown)
@@ -1 +1 @@
-22.17.0
+20.18.2

(file name not shown)
@@ -1,10 +1,10 @@
 metrics
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.2
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
@@ -5,8 +5,6 @@
  * before any other module to support code instrumentation.
  */
 
-const metricsModuleImportStartTime = performance.now()
-
 const APP_NAME = process.env.METRICS_APP_NAME || 'unknown'
 const BUILD_VERSION = process.env.BUILD_VERSION
 const ENABLE_PROFILE_AGENT = process.env.ENABLE_PROFILE_AGENT === 'true'
@@ -105,5 +103,3 @@ function recordProcessStart() {
   const metrics = require('.')
   metrics.inc('process_startup')
 }
-
-module.exports = { metricsModuleImportStartTime }

(file name not shown)
@@ -9,7 +9,7 @@
   "main": "index.js",
   "dependencies": {
     "@google-cloud/opentelemetry-cloud-trace-exporter": "^2.1.0",
-    "@google-cloud/profiler": "^6.0.3",
+    "@google-cloud/profiler": "^6.0.0",
     "@opentelemetry/api": "^1.4.1",
     "@opentelemetry/auto-instrumentations-node": "^0.39.1",
     "@opentelemetry/exporter-trace-otlp-http": "^0.41.2",
libraries/mongo-utils/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/mongo-utils/.gitignore (vendored, new file, 3 lines)
@@ -0,0 +1,3 @@
+
+# managed by monorepo$ bin/update_build_scripts
+.npmrc

(file name not shown)
@@ -1 +1 @@
-22.17.0
+20.18.2
@@ -16,7 +16,6 @@ let VERBOSE_LOGGING
 let BATCH_RANGE_START
 let BATCH_RANGE_END
 let BATCH_MAX_TIME_SPAN_IN_MS
-let BATCHED_UPDATE_RUNNING = false
 
 /**
  * @typedef {import("mongodb").Collection} Collection
@@ -35,7 +34,6 @@ let BATCHED_UPDATE_RUNNING = false
  * @property {string} [BATCH_RANGE_START]
  * @property {string} [BATCH_SIZE]
  * @property {string} [VERBOSE_LOGGING]
- * @property {(progress: string) => Promise<void>} [trackProgress]
  */
 
 /**
@@ -211,71 +209,59 @@ async function batchedUpdate(
   update,
   projection,
   findOptions,
-  batchedUpdateOptions = {}
+  batchedUpdateOptions
 ) {
-  // only a single batchedUpdate can run at a time due to global variables
-  if (BATCHED_UPDATE_RUNNING) {
-    throw new Error('batchedUpdate is already running')
+  ID_EDGE_PAST = await getIdEdgePast(collection)
+  if (!ID_EDGE_PAST) {
+    console.warn(
+      `The collection ${collection.collectionName} appears to be empty.`
+    )
+    return 0
   }
-  try {
-    BATCHED_UPDATE_RUNNING = true
-    ID_EDGE_PAST = await getIdEdgePast(collection)
-    if (!ID_EDGE_PAST) {
-      console.warn(
-        `The collection ${collection.collectionName} appears to be empty.`
-      )
-      return 0
-    }
-    refreshGlobalOptionsForBatchedUpdate(batchedUpdateOptions)
-    const { trackProgress = async progress => console.warn(progress) } =
-      batchedUpdateOptions
+  refreshGlobalOptionsForBatchedUpdate(batchedUpdateOptions)
 
   findOptions = findOptions || {}
   findOptions.readPreference = READ_PREFERENCE_SECONDARY
 
   projection = projection || { _id: 1 }
   let nextBatch
   let updated = 0
   let start = BATCH_RANGE_START
 
   while (start !== BATCH_RANGE_END) {
     let end = getNextEnd(start)
     nextBatch = await getNextBatch(
       collection,
       query,
       start,
       end,
       projection,
       findOptions
     )
     if (nextBatch.length > 0) {
       end = nextBatch[nextBatch.length - 1]._id
       updated += nextBatch.length
 
       if (VERBOSE_LOGGING) {
         console.log(
           `Running update on batch with ids ${JSON.stringify(
             nextBatch.map(entry => entry._id)
           )}`
         )
-      }
-      await trackProgress(
-        `Running update on batch ending ${renderObjectId(end)}`
-      )
+      } else {
+        console.error(`Running update on batch ending ${renderObjectId(end)}`)
+      }
 
       if (typeof update === 'function') {
         await update(nextBatch)
       } else {
         await performUpdate(collection, nextBatch, update)
       }
     }
-    await trackProgress(`Completed batch ending ${renderObjectId(end)}`)
+    console.error(`Completed batch ending ${renderObjectId(end)}`)
     start = end
   }
   return updated
-  } finally {
-    BATCHED_UPDATE_RUNNING = false
-  }
 }
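Two things disappear on the v5.4.1 side of the hunk above: the `trackProgress` callback and the module-level reentrancy guard. The guard exists because the helper keeps batching state in module-level globals, so concurrent runs would corrupt each other. A standalone sketch of the pattern, not the library's exact code:

```
let RUNNING = false

async function runExclusive(task) {
  // Refuse reentrant calls instead of corrupting shared module state.
  if (RUNNING) throw new Error('already running')
  try {
    RUNNING = true
    return await task()
  } finally {
    RUNNING = false // always reset, even if the task throws
  }
}
```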
(file name not shown)
@@ -1,10 +1,10 @@
 mongo-utils
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.2
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
libraries/o-error/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/o-error/.gitignore (vendored, new file, 5 lines)
@@ -0,0 +1,5 @@
+.nyc_output
+coverage
+node_modules/
+
+.npmrc

(file name not shown)
@@ -1 +1 @@
-22.17.0
+20.18.2

(file name not shown)
@@ -1,10 +1,10 @@
 o-error
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.2
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
@@ -1,34 +1,20 @@
-// @ts-check
-
 /**
  * Light-weight helpers for handling JavaScript Errors in node.js and the
  * browser.
  */
 class OError extends Error {
-  /**
-   * The error that is the underlying cause of this error
-   *
-   * @type {unknown}
-   */
-  cause
-
-  /**
-   * List of errors encountered as the callback chain is unwound
-   *
-   * @type {TaggedError[] | undefined}
-   */
-  _oErrorTags
-
   /**
    * @param {string} message as for built-in Error
    * @param {Object} [info] extra data to attach to the error
-   * @param {unknown} [cause] the internal error that caused this error
+   * @param {Error} [cause] the internal error that caused this error
    */
   constructor(message, info, cause) {
     super(message)
     this.name = this.constructor.name
     if (info) this.info = info
     if (cause) this.cause = cause
+    /** @private @type {Array<TaggedError> | undefined} */
+    this._oErrorTags // eslint-disable-line
   }
 
   /**
@@ -45,7 +31,7 @@ class OError extends Error {
   /**
    * Wrap the given error, which caused this error.
    *
-   * @param {unknown} cause the internal error that caused this error
+   * @param {Error} cause the internal error that caused this error
    * @return {this}
    */
   withCause(cause) {
@@ -79,16 +65,13 @@ class OError extends Error {
    * }
    * }
    *
-   * @template {unknown} E
-   * @param {E} error the error to tag
+   * @param {Error} error the error to tag
    * @param {string} [message] message with which to tag `error`
    * @param {Object} [info] extra data with wich to tag `error`
-   * @return {E} the modified `error` argument
+   * @return {Error} the modified `error` argument
    */
   static tag(error, message, info) {
-    const oError = /** @type {{ _oErrorTags: TaggedError[] | undefined }} */ (
-      error
-    )
+    const oError = /** @type{OError} */ (error)
 
     if (!oError._oErrorTags) oError._oErrorTags = []
 
@@ -119,7 +102,7 @@ class OError extends Error {
    *
    * If an info property is repeated, the last one wins.
    *
-   * @param {unknown} error any error (may or may not be an `OError`)
+   * @param {Error | null | undefined} error any error (may or may not be an `OError`)
    * @return {Object}
    */
   static getFullInfo(error) {
@@ -146,7 +129,7 @@ class OError extends Error {
    * Return the `stack` property from `error`, including the `stack`s for any
    * tagged errors added with `OError.tag` and for any `cause`s.
    *
-   * @param {unknown} error any error (may or may not be an `OError`)
+   * @param {Error | null | undefined} error any error (may or may not be an `OError`)
    * @return {string}
    */
   static getFullStack(error) {
@@ -160,7 +143,7 @@ class OError extends Error {
       stack += `\n${oError._oErrorTags.map(tag => tag.stack).join('\n')}`
     }
 
-    const causeStack = OError.getFullStack(oError.cause)
+    const causeStack = oError.cause && OError.getFullStack(oError.cause)
     if (causeStack) {
       stack += '\ncaused by:\n' + indent(causeStack)
     }
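For readers unfamiliar with the library being diffed: `OError.tag` annotates an error as it propagates, and `getFullInfo`/`getFullStack` merge the annotations back together. A small usage sketch, assuming the monorepo's package name and a hypothetical `loadFile` helper:

```
const OError = require('@overleaf/o-error')

async function readConfig(path) {
  try {
    return await loadFile(path) // hypothetical helper for the example
  } catch (err) {
    // attach context and rethrow; the original stack is preserved
    throw OError.tag(err, 'failed to read config', { path })
  }
}
// Later: OError.getFullInfo(err) merges { path } with any other tags,
// and OError.getFullStack(err) prints tagged stacks plus the cause chain.
```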
@@ -268,11 +268,6 @@ describe('utils', function () {
     expect(OError.getFullInfo(null)).to.deep.equal({})
   })
 
-  it('works when given a string', function () {
-    const err = 'not an error instance'
-    expect(OError.getFullInfo(err)).to.deep.equal({})
-  })
-
   it('works on a normal error', function () {
     const err = new Error('foo')
     expect(OError.getFullInfo(err)).to.deep.equal({})
@@ -35,14 +35,6 @@ describe('OError', function () {
     expect(err2.cause.message).to.equal('cause 2')
   })
 
-  it('accepts non-Error causes', function () {
-    const err1 = new OError('foo', {}, 'not-an-error')
-    expect(err1.cause).to.equal('not-an-error')
-
-    const err2 = new OError('foo').withCause('not-an-error')
-    expect(err2.cause).to.equal('not-an-error')
-  })
-
   it('handles a custom error type with a cause', function () {
     function doSomethingBadInternally() {
       throw new Error('internal error')
libraries/object-persistor/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/object-persistor/.gitignore (vendored, new file, 4 lines)
@@ -0,0 +1,4 @@
+/node_modules
+*.swp
+
+.npmrc

(file name not shown)
@@ -1 +1 @@
-22.17.0
+20.18.2

(file name not shown)
@@ -1,10 +1,10 @@
 object-persistor
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.2
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
@@ -305,10 +305,8 @@ module.exports = class FSPersistor extends AbstractPersistor {
 
   async _listDirectory(path) {
     if (this.useSubdirectories) {
-      // eslint-disable-next-line @typescript-eslint/return-await
       return await glob(Path.join(path, '**'))
     } else {
-      // eslint-disable-next-line @typescript-eslint/return-await
       return await glob(`${path}_*`)
     }
   }
@@ -33,10 +33,6 @@ const AES256_KEY_LENGTH = 32
  * @property {() => Promise<Array<RootKeyEncryptionKey>>} getRootKeyEncryptionKeys
  */
 
-/**
- * @typedef {import('./types').ListDirectoryResult} ListDirectoryResult
- */
-
 /**
  * Helper function to make TS happy when accessing error properties
  * AWSError is not an actual class, so we cannot use instanceof.
@@ -395,9 +391,9 @@ class PerProjectEncryptedS3Persistor extends S3Persistor {
  * A general "cache" for project keys is another alternative. For now, use a helper class.
  */
 class CachedPerProjectEncryptedS3Persistor {
   /** @type SSECOptions */
   #projectKeyOptions
   /** @type PerProjectEncryptedS3Persistor */
   #parent
 
   /**
@@ -428,16 +424,6 @@ class CachedPerProjectEncryptedS3Persistor {
     return await this.#parent.getObjectSize(bucketName, path)
   }
 
-  /**
-   *
-   * @param {string} bucketName
-   * @param {string} path
-   * @return {Promise<ListDirectoryResult>}
-   */
-  async listDirectory(bucketName, path) {
-    return await this.#parent.listDirectory(bucketName, path)
-  }
-
   /**
    * @param {string} bucketName
    * @param {string} path
@@ -20,18 +20,6 @@ const { URL } = require('node:url')
 const { WriteError, ReadError, NotFoundError } = require('./Errors')
 const zlib = require('node:zlib')
 
-/**
- * @typedef {import('aws-sdk/clients/s3').ListObjectsV2Output} ListObjectsV2Output
- */
-
-/**
- * @typedef {import('aws-sdk/clients/s3').Object} S3Object
- */
-
-/**
- * @typedef {import('./types').ListDirectoryResult} ListDirectoryResult
- */
-
 /**
  * Wrapper with private fields to avoid revealing them on console, JSON.stringify or similar.
  */
@@ -278,12 +266,26 @@ class S3Persistor extends AbstractPersistor {
    * @return {Promise<void>}
    */
   async deleteDirectory(bucketName, key, continuationToken) {
-    const { contents, response } = await this.listDirectory(
-      bucketName,
-      key,
-      continuationToken
-    )
-    const objects = contents.map(item => ({ Key: item.Key || '' }))
+    let response
+    const options = { Bucket: bucketName, Prefix: key }
+    if (continuationToken) {
+      options.ContinuationToken = continuationToken
+    }
+
+    try {
+      response = await this._getClientForBucket(bucketName)
+        .listObjectsV2(options)
+        .promise()
+    } catch (err) {
+      throw PersistorHelper.wrapError(
+        err,
+        'failed to list objects in S3',
+        { bucketName, key },
+        ReadError
+      )
+    }
+
+    const objects = response.Contents?.map(item => ({ Key: item.Key || '' }))
     if (objects?.length) {
       try {
         await this._getClientForBucket(bucketName)
@@ -314,36 +316,6 @@ class S3Persistor extends AbstractPersistor {
     }
   }
 
-  /**
-   *
-   * @param {string} bucketName
-   * @param {string} key
-   * @param {string} [continuationToken]
-   * @return {Promise<ListDirectoryResult>}
-   */
-  async listDirectory(bucketName, key, continuationToken) {
-    let response
-    const options = { Bucket: bucketName, Prefix: key }
-    if (continuationToken) {
-      options.ContinuationToken = continuationToken
-    }
-
-    try {
-      response = await this._getClientForBucket(bucketName)
-        .listObjectsV2(options)
-        .promise()
-    } catch (err) {
-      throw PersistorHelper.wrapError(
-        err,
-        'failed to list objects in S3',
-        { bucketName, key },
-        ReadError
-      )
-    }
-
-    return { contents: response.Contents ?? [], response }
-  }
-
   /**
    * @param {string} bucketName
    * @param {string} key

libraries/object-persistor/src/types.d.ts (vendored, 6 lines removed)
@@ -1,6 +0,0 @@
-import type { ListObjectsV2Output, Object } from 'aws-sdk/clients/s3'
-
-export type ListDirectoryResult = {
-  contents: Array<Object>
-  response: ListObjectsV2Output
-}
libraries/overleaf-editor-core/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/overleaf-editor-core/.gitignore (vendored, new file, 5 lines)
@@ -0,0 +1,5 @@
+/coverage
+/node_modules
+
+# managed by monorepo$ bin/update_build_scripts
+.npmrc

(file name not shown)
@@ -1 +1 @@
-22.17.0
+20.18.2

(file name not shown)
@@ -1,10 +1,10 @@
 overleaf-editor-core
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.2
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
|
@@ -18,7 +18,6 @@ const MoveFileOperation = require('./lib/operation/move_file_operation')
 const SetCommentStateOperation = require('./lib/operation/set_comment_state_operation')
 const EditFileOperation = require('./lib/operation/edit_file_operation')
 const EditNoOperation = require('./lib/operation/edit_no_operation')
-const EditOperationTransformer = require('./lib/operation/edit_operation_transformer')
 const SetFileMetadataOperation = require('./lib/operation/set_file_metadata_operation')
 const NoOperation = require('./lib/operation/no_operation')
 const Operation = require('./lib/operation')
@@ -44,8 +43,6 @@ const TrackingProps = require('./lib/file_data/tracking_props')
 const Range = require('./lib/range')
 const CommentList = require('./lib/file_data/comment_list')
 const LazyStringFileData = require('./lib/file_data/lazy_string_file_data')
-const StringFileData = require('./lib/file_data/string_file_data')
-const EditOperationBuilder = require('./lib/operation/edit_operation_builder')
 
 exports.AddCommentOperation = AddCommentOperation
 exports.Author = Author
@@ -61,7 +58,6 @@ exports.DeleteCommentOperation = DeleteCommentOperation
 exports.File = File
 exports.FileMap = FileMap
 exports.LazyStringFileData = LazyStringFileData
-exports.StringFileData = StringFileData
 exports.History = History
 exports.Label = Label
 exports.AddFileOperation = AddFileOperation
@@ -69,8 +65,6 @@ exports.MoveFileOperation = MoveFileOperation
 exports.SetCommentStateOperation = SetCommentStateOperation
 exports.EditFileOperation = EditFileOperation
 exports.EditNoOperation = EditNoOperation
-exports.EditOperationBuilder = EditOperationBuilder
-exports.EditOperationTransformer = EditOperationTransformer
 exports.SetFileMetadataOperation = SetFileMetadataOperation
 exports.NoOperation = NoOperation
 exports.Operation = Operation
@@ -13,7 +13,7 @@ const V2DocVersions = require('./v2_doc_versions')
 
 /**
  * @import Author from "./author"
- * @import { BlobStore, RawChange, ReadonlyBlobStore } from "./types"
+ * @import { BlobStore } from "./types"
  */
 
 /**
@@ -54,7 +54,7 @@ class Change {
   /**
    * For serialization.
    *
-   * @return {RawChange}
+   * @return {Object}
    */
   toRaw() {
     function toRaw(object) {
@@ -219,7 +219,7 @@ class Change {
    * If this Change contains any File objects, load them.
    *
    * @param {string} kind see {File#load}
-   * @param {ReadonlyBlobStore} blobStore
+   * @param {BlobStore} blobStore
    * @return {Promise<void>}
    */
   async loadFiles(kind, blobStore) {
@@ -1,7 +1,7 @@
 // @ts-check
 
 /**
- * @import { ClearTrackingPropsRawData, TrackingDirective } from '../types'
+ * @import { ClearTrackingPropsRawData } from '../types'
  */
 
 class ClearTrackingProps {
@@ -11,27 +11,12 @@ class ClearTrackingProps {
 
   /**
    * @param {any} other
-   * @returns {other is ClearTrackingProps}
+   * @returns {boolean}
    */
   equals(other) {
     return other instanceof ClearTrackingProps
   }
 
-  /**
-   * @param {TrackingDirective} other
-   * @returns {other is ClearTrackingProps}
-   */
-  canMergeWith(other) {
-    return other instanceof ClearTrackingProps
-  }
-
-  /**
-   * @param {TrackingDirective} other
-   */
-  mergeWith(other) {
-    return this
-  }
-
   /**
    * @returns {ClearTrackingPropsRawData}
    */
@@ -11,7 +11,7 @@ const EditOperation = require('../operation/edit_operation')
 const EditOperationBuilder = require('../operation/edit_operation_builder')
 
 /**
- * @import { BlobStore, ReadonlyBlobStore, RangesBlob, RawHashFileData, RawLazyStringFileData } from '../types'
+ * @import { BlobStore, ReadonlyBlobStore, RangesBlob, RawFileData, RawLazyStringFileData } from '../types'
  */
 
 class LazyStringFileData extends FileData {
@@ -159,11 +159,11 @@ class LazyStringFileData extends FileData {
 
   /** @inheritdoc
    * @param {BlobStore} blobStore
-   * @return {Promise<RawHashFileData>}
+   * @return {Promise<RawFileData>}
    */
   async store(blobStore) {
     if (this.operations.length === 0) {
-      /** @type RawHashFileData */
+      /** @type RawFileData */
       const raw = { hash: this.hash }
       if (this.rangesHash) {
         raw.rangesHash = this.rangesHash
@@ -171,11 +171,9 @@ class LazyStringFileData extends FileData {
       return raw
     }
     const eager = await this.toEager(blobStore)
-    const raw = await eager.store(blobStore)
-    this.hash = raw.hash
-    this.rangesHash = raw.rangesHash
     this.operations.length = 0
-    return raw
+    /** @type RawFileData */
+    return await eager.store(blobStore)
   }
 }
@@ -8,7 +8,7 @@ const CommentList = require('./comment_list')
 const TrackedChangeList = require('./tracked_change_list')
 
 /**
- * @import { StringFileRawData, RawHashFileData, BlobStore, CommentRawData } from "../types"
+ * @import { StringFileRawData, RawFileData, BlobStore, CommentRawData } from "../types"
  * @import { TrackedChangeRawData, RangesBlob } from "../types"
  * @import EditOperation from "../operation/edit_operation"
  */
@@ -88,14 +88,6 @@ class StringFileData extends FileData {
     return content
   }
 
-  /**
-   * Return docstore view of a doc: each line separated
-   * @return {string[]}
-   */
-  getLines() {
-    return this.getContent({ filterTrackedDeletes: true }).split('\n')
-  }
-
   /** @inheritdoc */
   getByteLength() {
     return Buffer.byteLength(this.content)
@@ -139,7 +131,7 @@ class StringFileData extends FileData {
   /**
    * @inheritdoc
    * @param {BlobStore} blobStore
-   * @return {Promise<RawHashFileData>}
+   * @return {Promise<RawFileData>}
    */
   async store(blobStore) {
     const blob = await blobStore.putString(this.content)
@@ -84,21 +84,6 @@ class TrackedChange {
       )
     )
   }
-
-  /**
-   * Return an equivalent tracked change whose extent is limited to the given
-   * range
-   *
-   * @param {Range} range
-   * @returns {TrackedChange | null} - the result or null if the intersection is empty
-   */
-  intersectRange(range) {
-    const intersection = this.range.intersect(range)
-    if (intersection == null) {
-      return null
-    }
-    return new TrackedChange(intersection, this.tracking)
-  }
 }
 
 module.exports = TrackedChange
@@ -2,11 +2,9 @@
 const Range = require('../range')
 const TrackedChange = require('./tracked_change')
 const TrackingProps = require('../file_data/tracking_props')
-const { InsertOp, RemoveOp, RetainOp } = require('../operation/scan_op')
 
 /**
  * @import { TrackingDirective, TrackedChangeRawData } from "../types"
- * @import TextOperation from "../operation/text_operation"
  */
 
 class TrackedChangeList {
@@ -60,22 +58,6 @@ class TrackedChangeList {
     return this._trackedChanges.filter(change => range.contains(change.range))
   }
 
-  /**
-   * Returns tracked changes that overlap with the given range
-   * @param {Range} range
-   * @returns {TrackedChange[]}
-   */
-  intersectRange(range) {
-    const changes = []
-    for (const change of this._trackedChanges) {
-      const intersection = change.intersectRange(range)
-      if (intersection != null) {
-        changes.push(intersection)
-      }
-    }
-    return changes
-  }
-
   /**
    * Returns the tracking props for a given range.
    * @param {Range} range
@@ -107,8 +89,6 @@ class TrackedChangeList {
 
   /**
    * Collapses consecutive (and compatible) ranges
-   *
-   * @private
    * @returns {void}
    */
   _mergeRanges() {
@@ -137,28 +117,12 @@ class TrackedChangeList {
   }
 
   /**
-   * Apply an insert operation
    *
    * @param {number} cursor
    * @param {string} insertedText
    * @param {{tracking?: TrackingProps}} opts
    */
   applyInsert(cursor, insertedText, opts = {}) {
-    this._applyInsert(cursor, insertedText, opts)
-    this._mergeRanges()
-  }
-
-  /**
-   * Apply an insert operation
-   *
-   * This method will not merge ranges at the end
-   *
-   * @private
-   * @param {number} cursor
-   * @param {string} insertedText
-   * @param {{tracking?: TrackingProps}} [opts]
-   */
-  _applyInsert(cursor, insertedText, opts = {}) {
     const newTrackedChanges = []
     for (const trackedChange of this._trackedChanges) {
       if (
@@ -207,29 +171,15 @@ class TrackedChangeList {
       newTrackedChanges.push(newTrackedChange)
     }
     this._trackedChanges = newTrackedChanges
+    this._mergeRanges()
   }
 
   /**
-   * Apply a delete operation to the list of tracked changes
    *
    * @param {number} cursor
    * @param {number} length
    */
   applyDelete(cursor, length) {
-    this._applyDelete(cursor, length)
-    this._mergeRanges()
-  }
-
-  /**
-   * Apply a delete operation to the list of tracked changes
-   *
-   * This method will not merge ranges at the end
-   *
-   * @private
-   * @param {number} cursor
-   * @param {number} length
-   */
-  _applyDelete(cursor, length) {
     const newTrackedChanges = []
     for (const trackedChange of this._trackedChanges) {
       const deletedRange = new Range(cursor, length)
@@ -255,31 +205,15 @@ class TrackedChangeList {
       }
     }
     this._trackedChanges = newTrackedChanges
-  }
-
-  /**
-   * Apply a retain operation to the list of tracked changes
-   *
-   * @param {number} cursor
-   * @param {number} length
-   * @param {{tracking?: TrackingDirective}} [opts]
-   */
-  applyRetain(cursor, length, opts = {}) {
-    this._applyRetain(cursor, length, opts)
     this._mergeRanges()
   }
 
   /**
-   * Apply a retain operation to the list of tracked changes
-   *
-   * This method will not merge ranges at the end
-   *
-   * @private
    * @param {number} cursor
    * @param {number} length
    * @param {{tracking?: TrackingDirective}} opts
    */
-  _applyRetain(cursor, length, opts = {}) {
+  applyRetain(cursor, length, opts = {}) {
     // If there's no tracking info, leave everything as-is
     if (!opts.tracking) {
       return
@@ -335,31 +269,6 @@ class TrackedChangeList {
       newTrackedChanges.push(newTrackedChange)
     }
     this._trackedChanges = newTrackedChanges
-  }
-
-  /**
-   * Apply a text operation to the list of tracked changes
-   *
-   * Ranges are merged only once at the end, for performance and to avoid
-   * problematic edge cases where intermediate ranges get incorrectly merged.
-   *
-   * @param {TextOperation} operation
-   */
-  applyTextOperation(operation) {
-    // this cursor tracks the destination document that gets modified as
-    // operations are applied to it.
-    let cursor = 0
-    for (const op of operation.ops) {
-      if (op instanceof InsertOp) {
-        this._applyInsert(cursor, op.insertion, { tracking: op.tracking })
-        cursor += op.insertion.length
-      } else if (op instanceof RemoveOp) {
-        this._applyDelete(cursor, op.length)
-      } else if (op instanceof RetainOp) {
-        this._applyRetain(cursor, op.length, { tracking: op.tracking })
-        cursor += op.length
-      }
-    }
     this._mergeRanges()
   }
 }
@@ -62,35 +62,6 @@ class TrackingProps {
       this.ts.getTime() === other.ts.getTime()
     )
   }
-
-  /**
-   * Are these tracking props compatible with the other tracking props for merging
-   * ranges?
-   *
-   * @param {TrackingDirective} other
-   * @returns {other is TrackingProps}
-   */
-  canMergeWith(other) {
-    if (!(other instanceof TrackingProps)) {
-      return false
-    }
-    return this.type === other.type && this.userId === other.userId
-  }
-
-  /**
-   * Merge two tracking props
-   *
-   * Assumes that `canMerge(other)` returns true
-   *
-   * @param {TrackingDirective} other
-   */
-  mergeWith(other) {
-    if (!this.canMergeWith(other)) {
-      throw new Error('Cannot merge with incompatible tracking props')
-    }
-    const ts = this.ts <= other.ts ? this.ts : other.ts
-    return new TrackingProps(this.type, this.userId, ts)
-  }
 }
 
 module.exports = TrackingProps
@@ -7,7 +7,7 @@ const Change = require('./change')
 const Snapshot = require('./snapshot')
 
 /**
- * @import { BlobStore, ReadonlyBlobStore } from "./types"
+ * @import { BlobStore } from "./types"
  */
 
 class History {
@@ -85,7 +85,7 @@ class History {
    * If this History contains any File objects, load them.
    *
    * @param {string} kind see {File#load}
-   * @param {ReadonlyBlobStore} blobStore
+   * @param {BlobStore} blobStore
    * @return {Promise<void>}
    */
   async loadFiles(kind, blobStore) {
@@ -36,20 +36,6 @@ class EditOperationBuilder {
     }
     throw new Error('Unsupported operation in EditOperationBuilder.fromJSON')
   }
-
-  /**
-   * @param {unknown} raw
-   * @return {raw is RawEditOperation}
-   */
-  static isValid(raw) {
-    return (
-      isTextOperation(raw) ||
-      isRawAddCommentOperation(raw) ||
-      isRawDeleteCommentOperation(raw) ||
-      isRawSetCommentStateOperation(raw) ||
-      isRawEditNoOperation(raw)
-    )
-  }
 }
 
 /**
@@ -13,7 +13,7 @@ let EditFileOperation = null
 let SetFileMetadataOperation = null
 
 /**
- * @import { ReadonlyBlobStore } from "../types"
+ * @import { BlobStore } from "../types"
  * @import Snapshot from "../snapshot"
  */
 
@@ -80,7 +80,7 @@ class Operation {
    * If this operation references any files, load the files.
    *
    * @param {string} kind see {File#load}
-   * @param {ReadOnlyBlobStore} blobStore
+   * @param {BlobStore} blobStore
    * @return {Promise<void>}
    */
   async loadFiles(kind, blobStore) {}
@@ -175,7 +175,7 @@ class InsertOp extends ScanOp {
       return false
     }
     if (this.tracking) {
-      if (!other.tracking || !this.tracking.canMergeWith(other.tracking)) {
+      if (!this.tracking.equals(other.tracking)) {
         return false
       }
     } else if (other.tracking) {
@@ -198,10 +198,7 @@ class InsertOp extends ScanOp {
       throw new Error('Cannot merge with incompatible operation')
     }
     this.insertion += other.insertion
-    if (this.tracking != null && other.tracking != null) {
-      this.tracking = this.tracking.mergeWith(other.tracking)
-    }
-    // We already have the same commentIds
+    // We already have the same tracking info and commentIds
   }
 
   /**
@@ -309,13 +306,9 @@ class RetainOp extends ScanOp {
       return false
     }
     if (this.tracking) {
-      if (!other.tracking || !this.tracking.canMergeWith(other.tracking)) {
-        return false
-      }
-    } else if (other.tracking) {
-      return false
+      return this.tracking.equals(other.tracking)
     }
-    return true
+    return !other.tracking
   }
 
   /**
@@ -326,9 +319,6 @@ class RetainOp extends ScanOp {
       throw new Error('Cannot merge with incompatible operation')
     }
     this.length += other.length
-    if (this.tracking != null && other.tracking != null) {
-      this.tracking = this.tracking.mergeWith(other.tracking)
-    }
   }
 
   /**
@@ -314,18 +314,25 @@ class TextOperation extends EditOperation {
           str
         )
       }
+      file.trackedChanges.applyRetain(result.length, op.length, {
+        tracking: op.tracking,
+      })
       result += str.slice(inputCursor, inputCursor + op.length)
       inputCursor += op.length
     } else if (op instanceof InsertOp) {
       if (containsNonBmpChars(op.insertion)) {
         throw new InvalidInsertionError(str, op.toJSON())
       }
+      file.trackedChanges.applyInsert(result.length, op.insertion, {
+        tracking: op.tracking,
+      })
       file.comments.applyInsert(
         new Range(result.length, op.insertion.length),
         { commentIds: op.commentIds }
       )
       result += op.insertion
     } else if (op instanceof RemoveOp) {
+      file.trackedChanges.applyDelete(result.length, op.length)
       file.comments.applyDelete(new Range(result.length, op.length))
       inputCursor += op.length
     } else {
@@ -345,8 +352,6 @@ class TextOperation extends EditOperation {
       throw new TextOperation.TooLongError(operation, result.length)
     }
 
-    file.trackedChanges.applyTextOperation(this)
-
     file.content = result
   }
 
@@ -395,36 +400,44 @@ class TextOperation extends EditOperation {
     for (let i = 0, l = ops.length; i < l; i++) {
       const op = ops[i]
       if (op instanceof RetainOp) {
-        if (op.tracking) {
-          // Where we need to end up after the retains
-          const target = strIndex + op.length
-          // A previous retain could have overriden some tracking info. Now we
-          // need to restore it.
-          const previousChanges = previousState.trackedChanges.intersectRange(
-            new Range(strIndex, op.length)
-          )
-
-          for (const change of previousChanges) {
-            if (strIndex < change.range.start) {
-              inverse.retain(change.range.start - strIndex, {
-                tracking: new ClearTrackingProps(),
-              })
-              strIndex = change.range.start
-            }
-            inverse.retain(change.range.length, {
-              tracking: change.tracking,
-            })
-            strIndex += change.range.length
-          }
-          if (strIndex < target) {
-            inverse.retain(target - strIndex, {
-              tracking: new ClearTrackingProps(),
-            })
-            strIndex = target
-          }
-        } else {
-          inverse.retain(op.length)
-          strIndex += op.length
-        }
+        // Where we need to end up after the retains
+        const target = strIndex + op.length
+        // A previous retain could have overriden some tracking info. Now we
+        // need to restore it.
+        const previousRanges = previousState.trackedChanges.inRange(
+          new Range(strIndex, op.length)
+        )
+
+        let removeTrackingInfoIfNeeded
+        if (op.tracking) {
+          removeTrackingInfoIfNeeded = new ClearTrackingProps()
+        }
+
+        for (const trackedChange of previousRanges) {
+          if (strIndex < trackedChange.range.start) {
+            inverse.retain(trackedChange.range.start - strIndex, {
+              tracking: removeTrackingInfoIfNeeded,
+            })
+            strIndex = trackedChange.range.start
+          }
+          if (trackedChange.range.end < strIndex + op.length) {
+            inverse.retain(trackedChange.range.length, {
+              tracking: trackedChange.tracking,
+            })
+            strIndex = trackedChange.range.end
+          }
+          if (trackedChange.range.end !== strIndex) {
+            // No need to split the range at the end
+            const [left] = trackedChange.range.splitAt(strIndex)
+            inverse.retain(left.length, { tracking: trackedChange.tracking })
+            strIndex = left.end
+          }
+        }
+        if (strIndex < target) {
+          inverse.retain(target - strIndex, {
+            tracking: removeTrackingInfoIfNeeded,
+          })
+          strIndex = target
+        }
       } else if (op instanceof InsertOp) {
         inverse.remove(op.insertion.length)
@@ -86,32 +86,10 @@
   }
 
   /**
-   * Does this range overlap another range?
-   *
-   * Overlapping means that the two ranges have at least one character in common
-   *
-   * @param {Range} other - the other range
+   * @param {Range} range
    */
-  overlaps(other) {
-    return this.start < other.end && this.end > other.start
-  }
-
-  /**
-   * Does this range overlap the start of another range?
-   *
-   * @param {Range} other - the other range
-   */
-  overlapsStart(other) {
-    return this.start <= other.start && this.end > other.start
-  }
-
-  /**
-   * Does this range overlap the end of another range?
-   *
-   * @param {Range} other - the other range
-   */
-  overlapsEnd(other) {
-    return this.start < other.end && this.end >= other.end
+  overlaps(range) {
+    return this.start < range.end && this.end > range.start
   }
 
   /**
@@ -249,26 +227,6 @@
     )
     return [rangeUpToCursor, rangeAfterCursor]
   }
-
-  /**
-   * Returns the intersection of this range with another range
-   *
-   * @param {Range} other - the other range
-   * @return {Range | null} the intersection or null if the intersection is empty
-   */
-  intersect(other) {
-    if (this.contains(other)) {
-      return other
-    } else if (other.contains(this)) {
-      return this
-    } else if (other.overlapsStart(this)) {
-      return new Range(this.pos, other.end - this.start)
-    } else if (other.overlapsEnd(this)) {
-      return new Range(other.pos, this.end - other.start)
-    } else {
-      return null
-    }
-  }
 }
 
 module.exports = Range
@@ -193,13 +193,4 @@ describe('LazyStringFileData', function () {
     expect(fileData.getStringLength()).to.equal(longString.length)
     expect(fileData.getOperations()).to.have.length(1)
   })
-
-  it('truncates its operations after being stored', async function () {
-    const testHash = File.EMPTY_FILE_HASH
-    const fileData = new LazyStringFileData(testHash, undefined, 0)
-    fileData.edit(new TextOperation().insert('abc'))
-    const stored = await fileData.store(this.blobStore)
-    expect(fileData.hash).to.equal(stored.hash)
-    expect(fileData.operations).to.deep.equal([])
-  })
 })
@@ -1,3 +1,4 @@
+// @ts-check
 'use strict'
 
 const { expect } = require('chai')
@@ -448,44 +449,4 @@ describe('Range', function () {
     expect(() => range.insertAt(16, 3)).to.throw()
   })
 })
-
-describe('intersect', function () {
-  it('should handle partially overlapping ranges', function () {
-    const range1 = new Range(5, 10)
-    const range2 = new Range(3, 6)
-    const intersection1 = range1.intersect(range2)
-    expect(intersection1.pos).to.equal(5)
-    expect(intersection1.length).to.equal(4)
-    const intersection2 = range2.intersect(range1)
-    expect(intersection2.pos).to.equal(5)
-    expect(intersection2.length).to.equal(4)
-  })
-
-  it('should intersect with itself', function () {
-    const range = new Range(5, 10)
-    const intersection = range.intersect(range)
-    expect(intersection.pos).to.equal(5)
-    expect(intersection.length).to.equal(10)
-  })
-
-  it('should handle nested ranges', function () {
-    const range1 = new Range(5, 10)
-    const range2 = new Range(7, 2)
-    const intersection1 = range1.intersect(range2)
-    expect(intersection1.pos).to.equal(7)
-    expect(intersection1.length).to.equal(2)
-    const intersection2 = range2.intersect(range1)
-    expect(intersection2.pos).to.equal(7)
-    expect(intersection2.length).to.equal(2)
-  })
-
-  it('should handle disconnected ranges', function () {
-    const range1 = new Range(5, 10)
-    const range2 = new Range(20, 30)
-    const intersection1 = range1.intersect(range2)
-    expect(intersection1).to.be.null
-    const intersection2 = range2.intersect(range1)
-    expect(intersection2).to.be.null
-  })
-})
 })
@@ -107,7 +107,7 @@ describe('RetainOp', function () {
     expect(op1.equals(new RetainOp(3))).to.be.true
   })
 
-  it('cannot merge with another RetainOp if the tracking user is different', function () {
+  it('cannot merge with another RetainOp if tracking info is different', function () {
     const op1 = new RetainOp(
       4,
       new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
@@ -120,14 +120,14 @@ describe('RetainOp', function () {
     expect(() => op1.mergeWith(op2)).to.throw(Error)
   })
 
-  it('can merge with another RetainOp if the tracking user is the same', function () {
+  it('can merge with another RetainOp if tracking info is the same', function () {
     const op1 = new RetainOp(
       4,
       new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
     )
     const op2 = new RetainOp(
       4,
-      new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:01.000Z'))
+      new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
    )
     op1.mergeWith(op2)
     expect(
@@ -310,7 +310,7 @@ describe('InsertOp', function () {
     expect(() => op1.mergeWith(op2)).to.throw(Error)
   })
 
-  it('cannot merge with another InsertOp if tracking user is different', function () {
+  it('cannot merge with another InsertOp if tracking info is different', function () {
     const op1 = new InsertOp(
       'a',
       new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
@@ -323,7 +323,7 @@ describe('InsertOp', function () {
     expect(() => op1.mergeWith(op2)).to.throw(Error)
   })
 
-  it('can merge with another InsertOp if tracking user and comment info is the same', function () {
+  it('can merge with another InsertOp if tracking and comment info is the same', function () {
     const op1 = new InsertOp(
       'a',
       new TrackingProps(
@@ -338,7 +338,7 @@ describe('InsertOp', function () {
       new TrackingProps(
         'insert',
         'user1',
-        new Date('2024-01-01T00:00:01.000Z')
+        new Date('2024-01-01T00:00:00.000Z')
       ),
       ['1', '2']
     )
@@ -322,47 +322,6 @@ describe('TextOperation', function () {
       new TextOperation().retain(4).remove(4).retain(3)
     )
   })
-
-  it('undoing a tracked delete restores the tracked changes', function () {
-    expectInverseToLeadToInitialState(
-      new StringFileData(
-        'the quick brown fox jumps over the lazy dog',
-        undefined,
-        [
-          {
-            range: { pos: 5, length: 5 },
-            tracking: {
-              ts: '2023-01-01T00:00:00.000Z',
-              type: 'insert',
-              userId: 'user1',
-            },
-          },
-          {
-            range: { pos: 12, length: 3 },
-            tracking: {
-              ts: '2023-01-01T00:00:00.000Z',
-              type: 'delete',
-              userId: 'user1',
-            },
-          },
-          {
-            range: { pos: 18, length: 5 },
-            tracking: {
-              ts: '2023-01-01T00:00:00.000Z',
-              type: 'insert',
-              userId: 'user1',
-            },
-          },
-        ]
-      ),
-      new TextOperation()
-        .retain(7)
-        .retain(13, {
-          tracking: new TrackingProps('delete', 'user1', new Date()),
-        })
-        .retain(23)
-    )
-  })
 })
 
 describe('compose', function () {
1 libraries/promise-utils/.dockerignore Normal file
@@ -0,0 +1 @@
+node_modules/

3 libraries/promise-utils/.gitignore vendored Normal file
@@ -0,0 +1,3 @@
+
+# managed by monorepo$ bin/update_build_scripts
+.npmrc

@@ -1 +1 @@
-22.17.0
+20.18.2

@@ -1,10 +1,10 @@
 promise-utils
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.2
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
1 libraries/ranges-tracker/.dockerignore Normal file
@@ -0,0 +1 @@
+node_modules/

13 libraries/ranges-tracker/.gitignore vendored Normal file
@@ -0,0 +1,13 @@
+**.swp
+
+app.js
+app/js/
+test/unit/js/
+public/build/
+
+node_modules/
+
+/public/js/chat.js
+plato/
+
+.npmrc

@@ -1 +1 @@
-22.17.0
+20.18.2

@@ -1,10 +1,10 @@
 ranges-tracker
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.2
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
1 libraries/redis-wrapper/.dockerignore Normal file
@@ -0,0 +1 @@
+node_modules/

13 libraries/redis-wrapper/.gitignore vendored Normal file
@@ -0,0 +1,13 @@
+**.swp
+
+app.js
+app/js/
+test/unit/js/
+public/build/
+
+node_modules/
+
+/public/js/chat.js
+plato/
+
+.npmrc

@@ -1 +1 @@
-22.17.0
+20.18.2
@@ -97,8 +97,7 @@ module.exports = class RedisLocker {
   }
 
   /**
-   * @param {string} id
-   * @param {function(Error, boolean, string): void} callback
+   * @param {Callback} callback
    */
   tryLock(id, callback) {
     if (callback == null) {
@@ -107,7 +106,7 @@ module.exports = class RedisLocker {
     const lockValue = this.randomLock()
     const key = this.getKey(id)
     const startTime = Date.now()
-    this.rclient.set(
+    return this.rclient.set(
       key,
       lockValue,
       'EX',
@@ -122,7 +121,7 @@ module.exports = class RedisLocker {
         const timeTaken = Date.now() - startTime
         if (timeTaken > MAX_REDIS_REQUEST_LENGTH) {
           // took too long, so try to free the lock
-          this.releaseLock(id, lockValue, function (err, result) {
+          return this.releaseLock(id, lockValue, function (err, result) {
            if (err != null) {
              return callback(err)
            } // error freeing lock
@@ -140,8 +139,7 @@ module.exports = class RedisLocker {
   }
 
   /**
-   * @param {string} id
-   * @param {function(Error, string): void} callback
+   * @param {Callback} callback
    */
   getLock(id, callback) {
     if (callback == null) {
@@ -155,7 +153,7 @@ module.exports = class RedisLocker {
       return callback(e)
     }
 
-    this.tryLock(id, (error, gotLock, lockValue) => {
+    return this.tryLock(id, (error, gotLock, lockValue) => {
      if (error != null) {
        return callback(error)
      }
@@ -175,15 +173,14 @@ module.exports = class RedisLocker {
   }
 
   /**
-   * @param {string} id
-   * @param {function(Error, boolean): void} callback
+   * @param {Callback} callback
    */
   checkLock(id, callback) {
     if (callback == null) {
       callback = function () {}
     }
     const key = this.getKey(id)
-    this.rclient.exists(key, (err, exists) => {
+    return this.rclient.exists(key, (err, exists) => {
      if (err != null) {
        return callback(err)
      }
@@ -199,26 +196,30 @@ module.exports = class RedisLocker {
   }
 
   /**
-   * @param {string} id
-   * @param {string} lockValue
-   * @param {function(Error, boolean): void} callback
+   * @param {Callback} callback
   */
   releaseLock(id, lockValue, callback) {
     const key = this.getKey(id)
-    this.rclient.eval(UNLOCK_SCRIPT, 1, key, lockValue, (err, result) => {
-      if (err != null) {
-        return callback(err)
-      } else if (result != null && result !== 1) {
-        // successful unlock should release exactly one key
-        logger.error(
-          { id, key, lockValue, redis_err: err, redis_result: result },
-          'unlocking error'
-        )
-        metrics.inc(this.metricsPrefix + '-unlock-error')
-        return callback(new Error('tried to release timed out lock'))
-      } else {
-        return callback(null, result)
+    return this.rclient.eval(
+      UNLOCK_SCRIPT,
+      1,
+      key,
+      lockValue,
+      (err, result) => {
+        if (err != null) {
+          return callback(err)
+        } else if (result != null && result !== 1) {
+          // successful unlock should release exactly one key
+          logger.error(
+            { id, key, lockValue, redis_err: err, redis_result: result },
+            'unlocking error'
+          )
+          metrics.inc(this.metricsPrefix + '-unlock-error')
+          return callback(new Error('tried to release timed out lock'))
+        } else {
+          return callback(null, result)
+        }
       }
-    })
+    )
   }
 }
@@ -1,10 +1,10 @@
 redis-wrapper
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.2
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
1 libraries/settings/.dockerignore Normal file
@@ -0,0 +1 @@
+node_modules/

5 libraries/settings/.gitignore vendored Normal file
@@ -0,0 +1,5 @@
+/.npmrc
+/node_modules
+
+# managed by monorepo$ bin/update_build_scripts
+.npmrc

@@ -1 +1 @@
-22.17.0
+20.18.2

@@ -1,10 +1,10 @@
 settings
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.2
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
1 libraries/stream-utils/.dockerignore Normal file
@@ -0,0 +1 @@
+node_modules/

3 libraries/stream-utils/.gitignore vendored Normal file
@@ -0,0 +1,3 @@
+
+# managed by monorepo$ bin/update_build_scripts
+.npmrc

@@ -1 +1 @@
-22.17.0
+20.18.2

@@ -1,10 +1,10 @@
 stream-utils
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.2
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
@@ -145,24 +145,6 @@ class LoggerStream extends Transform {
   }
 }
 
-class MeteredStream extends Transform {
-  #Metrics
-  #metric
-  #labels
-
-  constructor(Metrics, metric, labels) {
-    super()
-    this.#Metrics = Metrics
-    this.#metric = metric
-    this.#labels = labels
-  }
-
-  _transform(chunk, encoding, callback) {
-    this.#Metrics.count(this.#metric, chunk.byteLength, 1, this.#labels)
-    callback(null, chunk)
-  }
-}
-
 // Export our classes
 
 module.exports = {
@@ -171,7 +153,6 @@ module.exports = {
   LoggerStream,
   LimitedStream,
   TimeoutStream,
-  MeteredStream,
   SizeExceededError,
   AbortError,
 }
package-lock.json
generated
11281
package-lock.json
generated
File diff suppressed because it is too large
Load diff
33
package.json
33
package.json
|
@ -8,8 +8,8 @@
|
||||||
"@types/chai": "^4.3.0",
|
"@types/chai": "^4.3.0",
|
||||||
"@types/chai-as-promised": "^7.1.8",
|
"@types/chai-as-promised": "^7.1.8",
|
||||||
"@types/mocha": "^10.0.6",
|
"@types/mocha": "^10.0.6",
|
||||||
"@typescript-eslint/eslint-plugin": "^8.30.1",
|
"@typescript-eslint/eslint-plugin": "^8.0.0",
|
||||||
"@typescript-eslint/parser": "^8.30.1",
|
"@typescript-eslint/parser": "^8.0.0",
|
||||||
"eslint": "^8.15.0",
|
"eslint": "^8.15.0",
|
||||||
"eslint-config-prettier": "^8.5.0",
|
"eslint-config-prettier": "^8.5.0",
|
||||||
"eslint-config-standard": "^17.0.0",
|
"eslint-config-standard": "^17.0.0",
|
||||||
|
@ -18,24 +18,28 @@
|
||||||
"eslint-plugin-cypress": "^2.15.1",
|
"eslint-plugin-cypress": "^2.15.1",
|
||||||
"eslint-plugin-import": "^2.26.0",
|
"eslint-plugin-import": "^2.26.0",
|
||||||
"eslint-plugin-mocha": "^10.1.0",
|
"eslint-plugin-mocha": "^10.1.0",
|
||||||
"eslint-plugin-n": "^15.7.0",
|
"eslint-plugin-node": "^11.1.0",
|
||||||
"eslint-plugin-prettier": "^4.0.0",
|
"eslint-plugin-prettier": "^4.0.0",
|
||||||
"eslint-plugin-promise": "^6.0.0",
|
"eslint-plugin-promise": "^6.0.0",
|
||||||
"eslint-plugin-unicorn": "^56.0.0",
|
"eslint-plugin-unicorn": "^56.0.0",
|
||||||
"prettier": "3.6.2",
|
"prettier": "3.3.3",
|
||||||
"typescript": "^5.8.3"
|
"typescript": "^5.5.4"
|
||||||
},
|
|
||||||
"engines": {
|
|
||||||
"npm": "11.4.2"
|
|
||||||
},
|
},
|
||||||
"overrides": {
|
"overrides": {
|
||||||
"swagger-tools@0.10.4": {
|
"cross-env": {
|
||||||
"path-to-regexp": "3.3.0",
|
"cross-spawn": "^7.0.6"
|
||||||
"body-parser": "1.20.3",
|
|
||||||
"multer": "2.0.1"
|
|
||||||
},
|
},
|
||||||
"request@2.88.2": {
|
"fetch-mock": {
|
||||||
"tough-cookie": "5.1.2"
|
"path-to-regexp": "3.3.0"
|
||||||
|
},
|
||||||
|
"google-gax": {
|
||||||
|
"protobufjs": "^7.2.5"
|
||||||
|
},
|
||||||
|
"swagger-tools": {
|
||||||
|
"body-parser": "1.20.3",
|
||||||
|
"multer": "1.4.5-lts.1",
|
||||||
|
"path-to-regexp": "3.3.0",
|
||||||
|
"qs": "6.13.0"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
"scripts": {
|
"scripts": {
|
||||||
|
@ -51,7 +55,6 @@
|
||||||
"services/analytics",
|
"services/analytics",
|
||||||
"services/chat",
|
"services/chat",
|
||||||
"services/clsi",
|
"services/clsi",
|
||||||
"services/clsi-cache",
|
|
||||||
"services/clsi-perf",
|
"services/clsi-perf",
|
||||||
"services/contacts",
|
"services/contacts",
|
||||||
"services/docstore",
|
"services/docstore",
|
||||||
|
|
|
@@ -1,23 +0,0 @@
-diff --git a/node_modules/@node-saml/node-saml/lib/saml.js b/node_modules/@node-saml/node-saml/lib/saml.js
-index fba15b9..a5778cb 100644
---- a/node_modules/@node-saml/node-saml/lib/saml.js
-+++ b/node_modules/@node-saml/node-saml/lib/saml.js
-@@ -336,7 +336,8 @@ class SAML {
-         const requestOrResponse = request || response;
-         (0, utility_1.assertRequired)(requestOrResponse, "either request or response is required");
-         let buffer;
--        if (this.options.skipRequestCompression) {
-+        // logout requestOrResponse must be compressed anyway
-+        if (this.options.skipRequestCompression && operation !== "logout") {
-             buffer = Buffer.from(requestOrResponse, "utf8");
-         }
-         else {
-@@ -495,7 +496,7 @@ class SAML {
-             try {
-                 xml = Buffer.from(container.SAMLResponse, "base64").toString("utf8");
-                 doc = await (0, xml_1.parseDomFromString)(xml);
--                const inResponseToNodes = xml_1.xpath.selectAttributes(doc, "/*[local-name()='Response']/@InResponseTo");
-+                const inResponseToNodes = xml_1.xpath.selectAttributes(doc, "/*[local-name()='Response' or local-name()='LogoutResponse']/@InResponseTo");
-                 if (inResponseToNodes) {
-                     inResponseTo = inResponseToNodes.length ? inResponseToNodes[0].nodeValue : null;
-                     await this.validateInResponseTo(inResponseTo);
@@ -1,64 +0,0 @@
-diff --git a/node_modules/ldapauth-fork/lib/ldapauth.js b/node_modules/ldapauth-fork/lib/ldapauth.js
-index 85ecf36a8b..a7d07e0f78 100644
---- a/node_modules/ldapauth-fork/lib/ldapauth.js
-+++ b/node_modules/ldapauth-fork/lib/ldapauth.js
-@@ -69,6 +69,7 @@ function LdapAuth(opts) {
-   this.opts.bindProperty || (this.opts.bindProperty = 'dn');
-   this.opts.groupSearchScope || (this.opts.groupSearchScope = 'sub');
-   this.opts.groupDnProperty || (this.opts.groupDnProperty = 'dn');
-+  this.opts.tlsStarted = false;
-
-   EventEmitter.call(this);
-
-@@ -108,21 +109,7 @@ function LdapAuth(opts) {
-   this._userClient.on('error', this._handleError.bind(this));
-
-   var self = this;
--  if (this.opts.starttls) {
--    // When starttls is enabled, this callback supplants the 'connect' callback
--    this._adminClient.starttls(this.opts.tlsOptions, this._adminClient.controls, function(err) {
--      if (err) {
--        self._handleError(err);
--      } else {
--        self._onConnectAdmin();
--      }
--    });
--    this._userClient.starttls(this.opts.tlsOptions, this._userClient.controls, function(err) {
--      if (err) {
--        self._handleError(err);
--      }
--    });
--  } else if (opts.reconnect) {
-+  if (opts.reconnect && !this.opts.starttls) {
-     this.once('_installReconnectListener', function() {
-       self.log && self.log.trace('install reconnect listener');
-       self._adminClient.on('connect', function() {
-@@ -384,6 +371,28 @@ LdapAuth.prototype._findGroups = function(user, callback) {
-  */
- LdapAuth.prototype.authenticate = function(username, password, callback) {
-   var self = this;
-+  if (this.opts.starttls && !this.opts.tlsStarted) {
-+    // When starttls is enabled, this callback supplants the 'connect' callback
-+    this._adminClient.starttls(this.opts.tlsOptions, this._adminClient.controls, function (err) {
-+      if (err) {
-+        self._handleError(err);
-+      } else {
-+        self._onConnectAdmin(function(){self._handleAuthenticate(username, password, callback);});
-+      }
-+    });
-+    this._userClient.starttls(this.opts.tlsOptions, this._userClient.controls, function (err) {
-+      if (err) {
-+        self._handleError(err);
-+      }
-+    });
-+  } else {
-+    self._handleAuthenticate(username, password, callback);
-+  }
-+};
-+
-+LdapAuth.prototype._handleAuthenticate = function (username, password, callback) {
-+  this.opts.tlsStarted = true;
-+  var self = this;
-
-   if (typeof password === 'undefined' || password === null || password === '') {
-     return callback(new Error('no password given'));
@@ -1,22 +0,0 @@
-diff --git a/node_modules/pdfjs-dist/build/pdf.worker.mjs b/node_modules/pdfjs-dist/build/pdf.worker.mjs
-index 6c5c6f1..bb6b7d1 100644
---- a/node_modules/pdfjs-dist/build/pdf.worker.mjs
-+++ b/node_modules/pdfjs-dist/build/pdf.worker.mjs
-@@ -1830,7 +1830,7 @@ async function __wbg_init(module_or_path) {
-         }
-     }
-     if (typeof module_or_path === 'undefined') {
--        module_or_path = new URL('qcms_bg.wasm', import.meta.url);
-+        module_or_path = new URL(/* webpackIgnore: true */ 'qcms_bg.wasm', import.meta.url);
-     }
-     const imports = __wbg_get_imports();
-     if (typeof module_or_path === 'string' || typeof Request === 'function' && module_or_path instanceof Request || typeof URL === 'function' && module_or_path instanceof URL) {
-@@ -5358,7 +5358,7 @@ var OpenJPEG = (() => {
-       if (Module["locateFile"]) {
-         return locateFile("openjpeg.wasm");
-       }
--      return new URL("openjpeg.wasm", import.meta.url).href;
-+      return new URL(/* webpackIgnore: true */ "openjpeg.wasm", import.meta.url).href;
-     }
-     function getBinarySync(file) {
-       if (file == wasmBinaryFile && wasmBinary) {
@@ -115,3 +115,9 @@ ENV LOG_LEVEL="info"
 EXPOSE 80
 
 ENTRYPOINT ["/sbin/my_init"]
+
+# Store the revision
+# ------------------
+# This should be the last step to optimize docker image caching.
+ARG MONOREPO_REVISION
+RUN echo "monorepo-server-ce,$MONOREPO_REVISION" > /var/www/revisions.txt
@@ -2,7 +2,7 @@
 # Overleaf Base Image (sharelatex/sharelatex-base)
 # --------------------------------------------------
 
-FROM phusion/baseimage:noble-1.0.2
+FROM phusion/baseimage:noble-1.0.0
 
 # Makes sure LuaTex cache is writable
 # -----------------------------------
@@ -10,7 +10,7 @@ ENV TEXMFVAR=/var/lib/overleaf/tmp/texmf-var
 
 # Update to ensure dependencies are updated
 # ------------------------------------------
-ENV REBUILT_AFTER="2025-05-19"
+ENV REBUILT_AFTER="2025-03-27"
 
 # Install dependencies
 # --------------------
@@ -30,7 +30,7 @@ RUN --mount=type=cache,target=/var/cache/apt,sharing=locked \
   # install Node.js https://github.com/nodesource/distributions#nodejs
   && mkdir -p /etc/apt/keyrings \
   && curl -fsSL https://deb.nodesource.com/gpgkey/nodesource-repo.gpg.key | gpg --dearmor -o /etc/apt/keyrings/nodesource.gpg \
-  && echo "deb [signed-by=/etc/apt/keyrings/nodesource.gpg] https://deb.nodesource.com/node_22.x nodistro main" | tee /etc/apt/sources.list.d/nodesource.list \
+  && echo "deb [signed-by=/etc/apt/keyrings/nodesource.gpg] https://deb.nodesource.com/node_20.x nodistro main" | tee /etc/apt/sources.list.d/nodesource.list \
   && apt-get update \
   && apt-get install -y nodejs \
   \
Some files were not shown because too many files have changed in this diff.