Mirror of https://github.com/yu-i-i/overleaf-cep.git, synced 2025-07-28 02:00:07 +02:00
Compare commits
No commits in common. "ext-ce" and "v5.3.1" have entirely different histories.
3101 changed files with 130001 additions and 154787 deletions

Legend for the hunks below: '-' lines belong to the "ext-ce" side, '+' lines to the "v5.3.1" side, unprefixed lines appear on both.
Bug report issue template (file name not captured; evidently .github/ISSUE_TEMPLATE)
@@ -1,19 +1,10 @@
Both sides carry the same text; the capture shows the two versions differing only in blank lines and wrapping (19 lines on the ext-ce side, 10 on the v5.3.1 side):
 ---
 name: Bug report
 about: Report a bug
 title: ''
 labels: type:bug
 assignees: ''
 ---

 <!--
 Note: If you are using www.overleaf.com and have a problem,
 or if you would like to request a new feature please contact
 the support team at support@overleaf.com

 This form should only be used to report bugs in the
 Community Edition release of Overleaf.
 -->
README.md (63 lines changed)
@@ -14,52 +14,39 @@
   <a href="#license">License</a>
 </p>

-<img src="doc/screenshot.png" alt="A screenshot of a project being edited in Overleaf Extended Community Edition">
+<img src="doc/screenshot.png" alt="A screenshot of a project being edited in Overleaf Community Edition">
 <p align="center">
-  Figure 1: A screenshot of a project being edited in Overleaf Extended Community Edition.
+  Figure 1: A screenshot of a project being edited in Overleaf Community Edition.
 </p>

 ## Community Edition

-[Overleaf](https://www.overleaf.com) is an open-source online real-time collaborative LaTeX editor. Overleaf runs a hosted version at [www.overleaf.com](https://www.overleaf.com), but you can also run your own local version, and contribute to the development of Overleaf.
-
-## Extended Community Edition
-
-The present "extended" version of Overleaf CE includes:
-
-- Template Gallery
-- Sandboxed Compiles with TeX Live image selection
-- LDAP authentication
-- SAML authentication
-- OpenID Connect authentication
-- Real-time track changes and comments
-- Autocomplete of reference keys
-- Symbol Palette
-- "From External URL" feature
-
-> [!CAUTION]
-> Overleaf Community Edition is intended for use in environments where **all** users are trusted. Community Edition is **not** appropriate for scenarios where isolation of users is required due to Sandbox Compiles not being available. When not using Sandboxed Compiles, users have full read and write access to the `sharelatex` container resources (filesystem, network, environment variables) when running LaTeX compiles.
-> Therefore, in any environment where not all users can be fully trusted, it is strongly recommended to enable the Sandboxed Compiles feature available in the Extended Community Edition.
-
-For more information on Sandbox Compiles check out Overleaf [documentation](https://docs.overleaf.com/on-premises/configuration/overleaf-toolkit/server-pro-only-configuration/sandboxed-compiles).
+[Overleaf](https://www.overleaf.com) is an open-source online real-time collaborative LaTeX editor. We run a hosted version at [www.overleaf.com](https://www.overleaf.com), but you can also run your own local version, and contribute to the development of Overleaf.

 ## Enterprise

-If you want help installing and maintaining Overleaf in your lab or workplace, Overleaf offers an officially supported version called [Overleaf Server Pro](https://www.overleaf.com/for/enterprises).
+If you want help installing and maintaining Overleaf in your lab or workplace, we offer an officially supported version called [Overleaf Server Pro](https://www.overleaf.com/for/enterprises). It also includes more features for security (SSO with LDAP or SAML), administration and collaboration (e.g. tracked changes). [Find out more!](https://www.overleaf.com/for/enterprises)

 ## Keeping up to date

 Sign up to the [mailing list](https://mailchi.mp/overleaf.com/community-edition-and-server-pro) to get updates on Overleaf releases and development.

 ## Installation

-Detailed installation instructions can be found in the [Overleaf Toolkit](https://github.com/overleaf/toolkit/).
-Configuration details and release history for the Extended Community Edition can be found on the [Extended CE Wiki Page](https://github.com/yu-i-i/overleaf-cep/wiki).
+We have detailed installation instructions in the [Overleaf Toolkit](https://github.com/overleaf/toolkit/).

 ## Upgrading

 If you are upgrading from a previous version of Overleaf, please see the [Release Notes section on the Wiki](https://github.com/overleaf/overleaf/wiki#release-notes) for all of the versions between your current version and the version you are upgrading to.

 ## Overleaf Docker Image

 This repo contains two dockerfiles, [`Dockerfile-base`](server-ce/Dockerfile-base), which builds the
-`sharelatex/sharelatex-base:ext-ce` image, and [`Dockerfile`](server-ce/Dockerfile) which builds the
-`sharelatex/sharelatex:ext-ce` image.
+`sharelatex/sharelatex-base` image, and [`Dockerfile`](server-ce/Dockerfile) which builds the
+`sharelatex/sharelatex` (or "community") image.

 The Base image generally contains the basic dependencies like `wget`, plus `texlive`.
-This is split out because it's a pretty heavy set of
+We split this out because it's a pretty heavy set of
 dependencies, and it's nice to not have to rebuild all of that every time.

 The `sharelatex/sharelatex` image extends the base image and adds the actual Overleaf code
@@ -67,19 +54,23 @@ and services.

 Use `make build-base` and `make build-community` from `server-ce/` to build these images.

-The [Phusion base-image](https://github.com/phusion/baseimage-docker)
-(which is extended by the `base` image) provides a VM-like container
+We use the [Phusion base-image](https://github.com/phusion/baseimage-docker)
+(which is extended by our `base` image) to provide us with a VM-like container
 in which to run the Overleaf services. Baseimage uses the `runit` service
-manager to manage services, and init scripts from the `server-ce/runit`
-folder are added.
+manager to manage services, and we add our init-scripts from the `server-ce/runit`
+folder.


 ## Contributing

 Please see the [CONTRIBUTING](CONTRIBUTING.md) file for information on contributing to the development of Overleaf.

 ## Authors

-[The Overleaf Team](https://www.overleaf.com/about)
-[yu-i-i](https://github.com/yu-i-i/overleaf-cep) — Extensions for CE unless otherwise noted
+[The Overleaf Team](https://www.overleaf.com/about)

 ## License

 The code in this repository is released under the GNU AFFERO GENERAL PUBLIC LICENSE, version 3. A copy can be found in the [`LICENSE`](LICENSE) file.

-Copyright (c) Overleaf, 2014-2025.
+Copyright (c) Overleaf, 2014-2024.
Development README (file name not captured; evidently develop/README.md)

@@ -11,6 +11,12 @@ bin/build
 > [!NOTE]
 > If Docker is running out of RAM while building the services in parallel, create a `.env` file in this directory containing `COMPOSE_PARALLEL_LIMIT=1`.

+Next, initialize the database:
+
+```shell
+bin/init
+```
+
 Then start the services:

 ```shell

@@ -42,7 +48,7 @@ To do this, use the included `bin/dev` script:
 bin/dev
 ```

-This will start all services using `node --watch`, which will automatically monitor the code and restart the services as necessary.
+This will start all services using `nodemon`, which will automatically monitor the code and restart the services as necessary.

 To improve performance, you can start only a subset of the services in development mode by providing a space-separated list to the `bin/dev` script:

@@ -77,7 +83,6 @@ each service:
 | `filestore`       | 9235 |
 | `notifications`   | 9236 |
 | `real-time`       | 9237 |
-| `references`      | 9238 |
 | `history-v1`      | 9239 |
 | `project-history` | 9240 |
develop/bin/init (new executable file, 6 lines)
@@ -0,0 +1,6 @@
+#!/usr/bin/env bash
+
+docker compose up --detach mongo
+curl --max-time 10 --retry 5 --retry-delay 5 --retry-all-errors --silent --output /dev/null localhost:27017
+docker compose exec mongo mongosh --eval "rs.initiate({ _id: 'overleaf', members: [{ _id: 0, host: 'mongo:27017' }] })"
+docker compose down mongo
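For reference, a sketch of the same replica-set initialisation done from Node with the official `mongodb` driver instead of mongosh (illustration only — the repo itself uses the shell script above; the URL and replica-set parameters merely mirror that script):

```js
const { MongoClient } = require('mongodb')

async function initReplicaSet() {
  // directConnection lets us talk to the lone member before the set exists
  const client = new MongoClient('mongodb://localhost:27017', {
    directConnection: true,
  })
  try {
    await client.connect()
    await client.db('admin').command({
      replSetInitiate: {
        _id: 'overleaf',
        members: [{ _id: 0, host: 'mongo:27017' }],
      },
    })
  } finally {
    await client.close()
  }
}

initReplicaSet().catch(console.error)
```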
develop/dev.env (inferred)
@@ -6,18 +6,14 @@ DOCUMENT_UPDATER_HOST=document-updater
The ext-ce side has four entries the v5.3.1 side lacks; the capture does not show which. The full set displayed:
 FILESTORE_HOST=filestore
 GRACEFUL_SHUTDOWN_DELAY_SECONDS=0
 HISTORY_V1_HOST=history-v1
 HISTORY_REDIS_HOST=redis
 LISTEN_ADDRESS=0.0.0.0
 MONGO_HOST=mongo
 MONGO_URL=mongodb://mongo/sharelatex?directConnection=true
 NOTIFICATIONS_HOST=notifications
 PROJECT_HISTORY_HOST=project-history
 QUEUES_REDIS_HOST=redis
 REALTIME_HOST=real-time
 REDIS_HOST=redis
 REFERENCES_HOST=references
 SESSION_SECRET=foo
 V1_HISTORY_HOST=history-v1
 WEBPACK_HOST=webpack
 WEB_API_PASSWORD=overleaf
 WEB_API_USER=overleaf
Development compose override (file name not captured)
@@ -112,19 +112,8 @@ services:
       - ../services/real-time/app.js:/overleaf/services/real-time/app.js
       - ../services/real-time/config:/overleaf/services/real-time/config

-  references:
-    command: ["node", "--watch", "app.js"]
-    environment:
-      - NODE_OPTIONS=--inspect=0.0.0.0:9229
-    ports:
-      - "127.0.0.1:9238:9229"
-    volumes:
-      - ../services/references/app:/overleaf/services/references/app
-      - ../services/references/config:/overleaf/services/references/config
-      - ../services/references/app.js:/overleaf/services/references/app.js
-
   web:
-    command: ["node", "--watch", "app.mjs", "--watch-locales"]
+    command: ["node", "--watch", "app.js", "--watch-locales"]
     environment:
       - NODE_OPTIONS=--inspect=0.0.0.0:9229
     ports:
develop/docker-compose.yml (inferred)
@@ -1,5 +1,6 @@
 volumes:
   clsi-cache:
+  clsi-output:
   filestore-public-files:
   filestore-template-files:
   filestore-uploads:

@@ -25,16 +26,15 @@ services:
     env_file:
       - dev.env
     environment:
       - DOCKER_RUNNER=true
       - TEXLIVE_IMAGE=texlive-full # docker build texlive -t texlive-full
-      - SANDBOXED_COMPILES=true
-      - SANDBOXED_COMPILES_HOST_DIR_COMPILES=${PWD}/compiles
-      - SANDBOXED_COMPILES_HOST_DIR_OUTPUT=${PWD}/output
+      - COMPILES_HOST_DIR=${PWD}/compiles
     user: root
     volumes:
       - ${PWD}/compiles:/overleaf/services/clsi/compiles
-      - ${PWD}/output:/overleaf/services/clsi/output
       - ${DOCKER_SOCKET_PATH:-/var/run/docker.sock}:/var/run/docker.sock
-      - clsi-cache:/overleaf/services/clsi/cache
+      - clsi-output:/overleaf/services/clsi/output

   contacts:
     build:
@@ -88,20 +88,12 @@ services:
       - history-v1-buckets:/buckets

   mongo:
-    image: mongo:6.0
+    image: mongo:5
     command: --replSet overleaf
     ports:
       - "127.0.0.1:27017:27017" # for debugging
     volumes:
       - mongo-data:/data/db
-      - ../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
-    environment:
-      MONGO_INITDB_DATABASE: sharelatex
-    extra_hosts:
-      # Required when using the automatic database setup for initializing the
-      # replica set. This override is not needed when running the setup after
-      # starting up mongo.
-      - mongo:127.0.0.1

   notifications:
     build:
@@ -123,7 +115,7 @@ services:
       dockerfile: services/real-time/Dockerfile
     env_file:
       - dev.env

   redis:
     image: redis:5
     ports:

@@ -131,13 +123,6 @@ services:
     volumes:
       - redis-data:/data

-  references:
-    build:
-      context: ..
-      dockerfile: services/references/Dockerfile
-    env_file:
-      - dev.env
-
   web:
     build:
       context: ..

@@ -147,7 +132,7 @@ services:
       - dev.env
     environment:
       - APP_NAME=Overleaf Community Edition
-      - ENABLED_LINKED_FILE_TYPES=project_file,project_output_file,url
+      - ENABLED_LINKED_FILE_TYPES=project_file,project_output_file
       - EMAIL_CONFIRMATION_DISABLED=true
       - NODE_ENV=development
       - OVERLEAF_ALLOW_PUBLIC_ACCESS=true

@@ -168,7 +153,6 @@ services:
       - notifications
       - project-history
       - real-time
-      - references

   webpack:
     build:
doc/logo.png — binary file not shown (size: 13 KiB → 71 KiB)
(second image file, name not captured) — binary file not shown (size: 1 MiB → 587 KiB)
Sample server docker-compose.yml (file name not captured)
@@ -32,7 +32,7 @@ services:
       OVERLEAF_REDIS_HOST: redis
       REDIS_HOST: redis

-      ENABLED_LINKED_FILE_TYPES: 'project_file,project_output_file,url'
+      ENABLED_LINKED_FILE_TYPES: 'project_file,project_output_file'

       # Enables Thumbnail generation using ImageMagick
       ENABLE_CONVERSIONS: 'true'

@@ -40,6 +40,10 @@ services:
       # Disables email confirmation requirement
       EMAIL_CONFIRMATION_DISABLED: 'true'

+      # temporary fix for LuaLaTex compiles
+      # see https://github.com/overleaf/overleaf/issues/695
+      TEXMFVAR: /var/lib/overleaf/tmp/texmf-var
+
       ## Set for SSL via nginx-proxy
       #VIRTUAL_HOST: 103.112.212.22
@@ -73,19 +77,11 @@ services:
       ## Server Pro ##
       ################

-      ## The Community Edition is intended for use in environments where all users are trusted and is not appropriate for
-      ## scenarios where isolation of users is required. Sandboxed Compiles are not available in the Community Edition,
-      ## so the following environment variables must be commented out to avoid compile issues.
-      ##
-      ## Sandboxed Compiles: https://docs.overleaf.com/on-premises/configuration/overleaf-toolkit/server-pro-only-configuration/sandboxed-compiles
+      ## Sandboxed Compiles: https://github.com/overleaf/overleaf/wiki/Server-Pro:-Sandboxed-Compiles
       SANDBOXED_COMPILES: 'true'
-      ### Bind-mount source for /var/lib/overleaf/data/compiles inside the container.
-      SANDBOXED_COMPILES_HOST_DIR_COMPILES: '/home/user/sharelatex_data/data/compiles'
-      ### Bind-mount source for /var/lib/overleaf/data/output inside the container.
-      SANDBOXED_COMPILES_HOST_DIR_OUTPUT: '/home/user/sharelatex_data/data/output'
-      ### Backwards compatibility (before Server Pro 5.5)
-      DOCKER_RUNNER: 'true'
+      SANDBOXED_COMPILES_SIBLING_CONTAINERS: 'true'
+      ### Bind-mount source for /var/lib/overleaf/data/compiles inside the container.
+      SANDBOXED_COMPILES_HOST_DIR: '/home/user/sharelatex_data/data/compiles'

       ## Works with test LDAP server shown at bottom of docker compose
       # OVERLEAF_LDAP_URL: 'ldap://ldap:389'
@@ -106,12 +102,12 @@ services:

   mongo:
     restart: always
-    image: mongo:6.0
+    image: mongo:5.0
     container_name: mongo
     command: '--replSet overleaf'
     volumes:
       - ~/mongo_data:/data/db
-      - ./bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
+      - ./mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
     environment:
       MONGO_INITDB_DATABASE: sharelatex
     extra_hosts:

@@ -119,7 +115,7 @@ services:
       # This override is not needed when running the setup after starting up mongo.
       - mongo:127.0.0.1
     healthcheck:
-      test: echo 'db.stats().ok' | mongosh localhost:27017/test --quiet
+      test: echo 'db.stats().ok' | mongo localhost:27017/test --quiet
       interval: 10s
       timeout: 10s
       retries: 5
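The healthcheck change only swaps the legacy `mongo` shell for `mongosh`; the probe itself is the same `db.stats().ok` call. The equivalent check issued from Node with the official driver, as a standalone sketch (connection URL assumed):

```js
const { MongoClient } = require('mongodb')

async function isHealthy() {
  const client = new MongoClient('mongodb://localhost:27017')
  try {
    await client.connect()
    // dbStats is the command behind the shell's db.stats()
    const stats = await client.db('test').command({ dbStats: 1 })
    return stats.ok === 1
  } catch {
    return false
  } finally {
    await client.close()
  }
}

isHealthy().then(ok => process.exit(ok ? 0 : 1))
```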
libraries/access-token-encryptor/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/access-token-encryptor/.gitignore (new file, vendored, 46 lines)
@@ -0,0 +1,46 @@
+compileFolder
+
+# Compiled source #
+###################
+*.com
+*.class
+*.dll
+*.exe
+*.o
+*.so
+
+# Packages #
+############
+# it's better to unpack these files and commit the raw source
+# git has its own built in compression methods
+*.7z
+*.dmg
+*.gz
+*.iso
+*.jar
+*.rar
+*.tar
+*.zip
+
+# Logs and databases #
+######################
+*.log
+*.sql
+*.sqlite
+
+# OS generated files #
+######################
+.DS_Store?
+ehthumbs.db
+Icon?
+Thumbs.db
+
+/node_modules/*
+data/*/*
+
+**.swp
+
+/log.json
+hash_folder
+
+.npmrc
Node version file (path not captured)
@@ -1 +1 @@
-22.17.0
+20.18.0
buildscript.txt
@@ -1,10 +1,10 @@
 access-token-encryptor
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
package.json
@@ -21,7 +21,7 @@
   "devDependencies": {
     "chai": "^4.3.6",
     "chai-as-promised": "^7.1.1",
-    "mocha": "^11.1.0",
+    "mocha": "^10.2.0",
     "sandboxed-module": "^2.0.4",
     "typescript": "^5.0.4"
   }
libraries/fetch-utils/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/fetch-utils/.gitignore (new file, vendored, 3 lines)
@@ -0,0 +1,3 @@
+
+# managed by monorepo$ bin/update_build_scripts
+.npmrc
Node version file (path not captured)
@@ -1 +1 @@
-22.17.0
+20.18.0

buildscript.txt
@@ -1,10 +1,10 @@
 fetch-utils
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
libraries/fetch-utils/index.js (inferred from content)

@@ -23,11 +23,11 @@ async function fetchJson(url, opts = {}) {
 }

 async function fetchJsonWithResponse(url, opts = {}) {
-  const { fetchOpts, detachSignal } = parseOpts(opts)
+  const { fetchOpts } = parseOpts(opts)
   fetchOpts.headers = fetchOpts.headers ?? {}
   fetchOpts.headers.Accept = fetchOpts.headers.Accept ?? 'application/json'

-  const response = await performRequest(url, fetchOpts, detachSignal)
+  const response = await performRequest(url, fetchOpts)
   if (!response.ok) {
     const body = await maybeGetResponseBody(response)
     throw new RequestFailedError(url, opts, response, body)

@@ -53,8 +53,8 @@ async function fetchStream(url, opts = {}) {
 }

 async function fetchStreamWithResponse(url, opts = {}) {
-  const { fetchOpts, abortController, detachSignal } = parseOpts(opts)
-  const response = await performRequest(url, fetchOpts, detachSignal)
+  const { fetchOpts, abortController } = parseOpts(opts)
+  const response = await performRequest(url, fetchOpts)

   if (!response.ok) {
     const body = await maybeGetResponseBody(response)

@@ -76,8 +76,8 @@ async function fetchStreamWithResponse(url, opts = {}) {
  * @throws {RequestFailedError} if the response has a failure status code
  */
 async function fetchNothing(url, opts = {}) {
-  const { fetchOpts, detachSignal } = parseOpts(opts)
-  const response = await performRequest(url, fetchOpts, detachSignal)
+  const { fetchOpts } = parseOpts(opts)
+  const response = await performRequest(url, fetchOpts)
   if (!response.ok) {
     const body = await maybeGetResponseBody(response)
     throw new RequestFailedError(url, opts, response, body)

@@ -95,22 +95,9 @@ async function fetchNothing(url, opts = {}) {
  * @throws {RequestFailedError} if the response has a non redirect status code or missing Location header
  */
 async function fetchRedirect(url, opts = {}) {
-  const { location } = await fetchRedirectWithResponse(url, opts)
-  return location
-}
-
-/**
- * Make a request and extract the redirect from the response.
- *
- * @param {string | URL} url - request URL
- * @param {object} opts - fetch options
- * @return {Promise<{location: string, response: Response}>}
- * @throws {RequestFailedError} if the response has a non redirect status code or missing Location header
- */
-async function fetchRedirectWithResponse(url, opts = {}) {
-  const { fetchOpts, detachSignal } = parseOpts(opts)
+  const { fetchOpts } = parseOpts(opts)
   fetchOpts.redirect = 'manual'
-  const response = await performRequest(url, fetchOpts, detachSignal)
+  const response = await performRequest(url, fetchOpts)
   if (response.status < 300 || response.status >= 400) {
     const body = await maybeGetResponseBody(response)
     throw new RequestFailedError(url, opts, response, body)

@@ -125,7 +112,7 @@ async function fetchRedirectWithResponse(url, opts = {}) {
     )
   }
   await discardResponseBody(response)
-  return { location, response }
+  return location
 }

 /**

@@ -142,8 +129,8 @@ async function fetchString(url, opts = {}) {
 }

 async function fetchStringWithResponse(url, opts = {}) {
-  const { fetchOpts, detachSignal } = parseOpts(opts)
-  const response = await performRequest(url, fetchOpts, detachSignal)
+  const { fetchOpts } = parseOpts(opts)
+  const response = await performRequest(url, fetchOpts)
   if (!response.ok) {
     const body = await maybeGetResponseBody(response)
     throw new RequestFailedError(url, opts, response, body)

@@ -178,14 +165,13 @@ function parseOpts(opts) {

   const abortController = new AbortController()
   fetchOpts.signal = abortController.signal
-  let detachSignal = () => {}
   if (opts.signal) {
-    detachSignal = abortOnSignal(abortController, opts.signal)
+    abortOnSignal(abortController, opts.signal)
   }
   if (opts.body instanceof Readable) {
     abortOnDestroyedRequest(abortController, fetchOpts.body)
   }
-  return { fetchOpts, abortController, detachSignal }
+  return { fetchOpts, abortController }
 }

 function setupJsonBody(fetchOpts, json) {

@@ -209,9 +195,6 @@ function abortOnSignal(abortController, signal) {
     abortController.abort(signal.reason)
   }
   signal.addEventListener('abort', listener)
-  return () => {
-    signal.removeEventListener('abort', listener)
-  }
 }

 function abortOnDestroyedRequest(abortController, stream) {

@@ -230,12 +213,11 @@ function abortOnDestroyedResponse(abortController, response) {
   })
 }

-async function performRequest(url, fetchOpts, detachSignal) {
+async function performRequest(url, fetchOpts) {
   let response
   try {
     response = await fetch(url, fetchOpts)
   } catch (err) {
-    detachSignal()
     if (fetchOpts.body instanceof Readable) {
       fetchOpts.body.destroy()
     }

@@ -244,7 +226,6 @@ async function performRequest(url, fetchOpts, detachSignal) {
       method: fetchOpts.method ?? 'GET',
     })
   }
-  response.body.on('close', detachSignal)
   if (fetchOpts.body instanceof Readable) {
     response.body.on('close', () => {
       if (!fetchOpts.body.readableEnded) {

@@ -316,7 +297,6 @@ module.exports = {
   fetchStreamWithResponse,
   fetchNothing,
   fetchRedirect,
-  fetchRedirectWithResponse,
   fetchString,
   fetchStringWithResponse,
   RequestFailedError,
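Summarising the hunks above: the ext-ce side has parseOpts return a detachSignal callback (created by abortOnSignal) and performRequest invoke it when the request fails or the response body closes, so a long-lived caller-supplied AbortSignal does not accumulate 'abort' listeners across requests. A standalone sketch of that cleanup pattern (the names and the timer stand-in for fetch are mine, not repo code):

```js
const { setTimeout: sleep } = require('node:timers/promises')

function attachAbort(signal, onAbort) {
  signal.addEventListener('abort', onAbort)
  // returning a detach function mirrors the detachSignal pattern in the diff
  return () => signal.removeEventListener('abort', onAbort)
}

async function demo() {
  const signal = AbortSignal.timeout(60_000) // one signal shared by many requests
  for (let i = 0; i < 3; i++) {
    const controller = new AbortController()
    const detach = attachAbort(signal, () => controller.abort(signal.reason))
    try {
      await sleep(10, null, { signal: controller.signal }) // stand-in for fetch()
    } finally {
      detach() // without this, the shared signal keeps one listener per iteration
    }
  }
}

demo().catch(console.error)
```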
package.json
@@ -20,8 +20,8 @@
   "body-parser": "^1.20.3",
   "chai": "^4.3.6",
   "chai-as-promised": "^7.1.1",
-  "express": "^4.21.2",
-  "mocha": "^11.1.0",
+  "express": "^4.21.0",
+  "mocha": "^10.2.0",
   "typescript": "^5.0.4"
 },
 "dependencies": {
fetch-utils test file (name not captured)

@@ -1,9 +1,6 @@
 const { expect } = require('chai')
-const fs = require('node:fs')
-const events = require('node:events')
 const { FetchError, AbortError } = require('node-fetch')
 const { Readable } = require('node:stream')
-const { pipeline } = require('node:stream/promises')
 const { once } = require('node:events')
 const { TestServer } = require('./helpers/TestServer')
 const selfsigned = require('selfsigned')

@@ -206,31 +203,6 @@ describe('fetch-utils', function () {
     ).to.be.rejectedWith(AbortError)
     expect(stream.destroyed).to.be.true
   })
-
-  it('detaches from signal on success', async function () {
-    const signal = AbortSignal.timeout(10_000)
-    for (let i = 0; i < 20; i++) {
-      const s = await fetchStream(this.url('/hello'), { signal })
-      expect(events.getEventListeners(signal, 'abort')).to.have.length(1)
-      await pipeline(s, fs.createWriteStream('/dev/null'))
-      expect(events.getEventListeners(signal, 'abort')).to.have.length(0)
-    }
-  })
-
-  it('detaches from signal on error', async function () {
-    const signal = AbortSignal.timeout(10_000)
-    for (let i = 0; i < 20; i++) {
-      try {
-        await fetchStream(this.url('/500'), { signal })
-      } catch (err) {
-        if (err instanceof RequestFailedError && err.response.status === 500)
-          continue
-        throw err
-      } finally {
-        expect(events.getEventListeners(signal, 'abort')).to.have.length(0)
-      }
-    }
-  })
 })

 describe('fetchNothing', function () {

@@ -419,16 +391,9 @@ async function* infiniteIterator() {
 async function abortOnceReceived(func, server) {
   const controller = new AbortController()
   const promise = func(controller.signal)
-  expect(events.getEventListeners(controller.signal, 'abort')).to.have.length(1)
   await once(server.events, 'request-received')
   controller.abort()
-  try {
-    return await promise
-  } finally {
-    expect(events.getEventListeners(controller.signal, 'abort')).to.have.length(
-      0
-    )
-  }
+  return await promise
 }

 async function expectRequestAborted(req) {
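The two deleted tests assert the listener cleanup by counting 'abort' listeners on the shared signal. A minimal standalone illustration of that counting technique, using the same Node API the tests use:

```js
const events = require('node:events')

const signal = AbortSignal.timeout(10_000)
console.log(events.getEventListeners(signal, 'abort').length) // 0

const listener = () => {}
signal.addEventListener('abort', listener)
console.log(events.getEventListeners(signal, 'abort').length) // 1 — a leak if never removed

signal.removeEventListener('abort', listener)
console.log(events.getEventListeners(signal, 'abort').length) // 0 — detached cleanly
```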
libraries/logger/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/logger/.gitignore (new file, vendored, 3 lines)
@@ -0,0 +1,3 @@
+node_modules
+
+.npmrc
Node version file (path not captured)
@@ -1 +1 @@
-22.17.0
+20.18.0

buildscript.txt
@@ -1,10 +1,10 @@
 logger
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
logging-manager.js (inferred)
@@ -14,10 +14,8 @@ const LoggingManager = {
   initialize(name) {
     this.isProduction =
       (process.env.NODE_ENV || '').toLowerCase() === 'production'
-    const isTest = (process.env.NODE_ENV || '').toLowerCase() === 'test'
     this.defaultLevel =
-      process.env.LOG_LEVEL ||
-      (this.isProduction ? 'info' : isTest ? 'fatal' : 'debug')
+      process.env.LOG_LEVEL || (this.isProduction ? 'info' : 'debug')
     this.loggerName = name
     this.logger = bunyan.createLogger({
       name,
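The extra branch on the ext-ce side gives test runs a quiet default ('fatal') while leaving the LOG_LEVEL override in charge. A standalone sketch of the resulting precedence (function name invented for illustration):

```js
function defaultLevel(env) {
  const nodeEnv = (env.NODE_ENV || '').toLowerCase()
  // explicit LOG_LEVEL wins; otherwise NODE_ENV picks the default
  return (
    env.LOG_LEVEL ||
    (nodeEnv === 'production' ? 'info' : nodeEnv === 'test' ? 'fatal' : 'debug')
  )
}

console.log(defaultLevel({ NODE_ENV: 'test' })) // 'fatal'
console.log(defaultLevel({ NODE_ENV: 'production', LOG_LEVEL: 'warn' })) // 'warn'
console.log(defaultLevel({})) // 'debug'
```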
package.json
@@ -27,7 +27,7 @@
   },
   "devDependencies": {
     "chai": "^4.3.6",
-    "mocha": "^11.1.0",
+    "mocha": "^10.2.0",
     "sandboxed-module": "^2.0.4",
     "sinon": "^9.2.4",
     "sinon-chai": "^3.7.0",
libraries/metrics/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/metrics/.gitignore (new file, vendored, 3 lines)
@@ -0,0 +1,3 @@
+node_modules
+
+.npmrc
Node version file (path not captured)
@@ -1 +1 @@
-22.17.0
+20.18.0

buildscript.txt
@@ -1,10 +1,10 @@
 metrics
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
metrics module entry point (name not captured)
@@ -5,8 +5,6 @@
  * before any other module to support code instrumentation.
  */

-const metricsModuleImportStartTime = performance.now()
-
 const APP_NAME = process.env.METRICS_APP_NAME || 'unknown'
 const BUILD_VERSION = process.env.BUILD_VERSION
 const ENABLE_PROFILE_AGENT = process.env.ENABLE_PROFILE_AGENT === 'true'

@@ -105,5 +103,3 @@ function recordProcessStart() {
   const metrics = require('.')
   metrics.inc('process_startup')
 }
-
-module.exports = { metricsModuleImportStartTime }
package.json
@@ -9,7 +9,7 @@
   "main": "index.js",
   "dependencies": {
     "@google-cloud/opentelemetry-cloud-trace-exporter": "^2.1.0",
-    "@google-cloud/profiler": "^6.0.3",
+    "@google-cloud/profiler": "^6.0.0",
     "@opentelemetry/api": "^1.4.1",
     "@opentelemetry/auto-instrumentations-node": "^0.39.1",
     "@opentelemetry/exporter-trace-otlp-http": "^0.41.2",

@@ -23,7 +23,7 @@
   "devDependencies": {
     "bunyan": "^1.0.0",
     "chai": "^4.3.6",
-    "mocha": "^11.1.0",
+    "mocha": "^10.2.0",
     "sandboxed-module": "^2.0.4",
     "sinon": "^9.2.4",
     "typescript": "^5.0.4"
libraries/mongo-utils/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/mongo-utils/.gitignore (new file, vendored, 3 lines)
@@ -0,0 +1,3 @@
+
+# managed by monorepo$ bin/update_build_scripts
+.npmrc
Node version file (path not captured)
@@ -1 +1 @@
-22.17.0
+20.18.0

Batched-update helper (file name not captured; evidently batched_update.js)
@@ -16,7 +16,6 @@ let VERBOSE_LOGGING
 let BATCH_RANGE_START
 let BATCH_RANGE_END
 let BATCH_MAX_TIME_SPAN_IN_MS
-let BATCHED_UPDATE_RUNNING = false

 /**
  * @typedef {import("mongodb").Collection} Collection

@@ -35,7 +34,6 @@ let BATCHED_UPDATE_RUNNING = false
  * @property {string} [BATCH_RANGE_START]
  * @property {string} [BATCH_SIZE]
  * @property {string} [VERBOSE_LOGGING]
- * @property {(progress: string) => Promise<void>} [trackProgress]
  */

 /**
@@ -211,71 +209,59 @@ async function batchedUpdate(
   update,
   projection,
   findOptions,
-  batchedUpdateOptions = {}
+  batchedUpdateOptions
 ) {
-  // only a single batchedUpdate can run at a time due to global variables
-  if (BATCHED_UPDATE_RUNNING) {
-    throw new Error('batchedUpdate is already running')
-  }
-  try {
-    BATCHED_UPDATE_RUNNING = true
-    ID_EDGE_PAST = await getIdEdgePast(collection)
-    if (!ID_EDGE_PAST) {
-      console.warn(
-        `The collection ${collection.collectionName} appears to be empty.`
-      )
-      return 0
-    }
-    const { trackProgress = async progress => console.warn(progress) } =
-      batchedUpdateOptions
-    refreshGlobalOptionsForBatchedUpdate(batchedUpdateOptions)
-
-    findOptions = findOptions || {}
-    findOptions.readPreference = READ_PREFERENCE_SECONDARY
-
-    projection = projection || { _id: 1 }
-    let nextBatch
-    let updated = 0
-    let start = BATCH_RANGE_START
-
-    while (start !== BATCH_RANGE_END) {
-      let end = getNextEnd(start)
-      nextBatch = await getNextBatch(
-        collection,
-        query,
-        start,
-        end,
-        projection,
-        findOptions
-      )
-      if (nextBatch.length > 0) {
-        end = nextBatch[nextBatch.length - 1]._id
-        updated += nextBatch.length
-
-        if (VERBOSE_LOGGING) {
-          console.log(
-            `Running update on batch with ids ${JSON.stringify(
-              nextBatch.map(entry => entry._id)
-            )}`
-          )
-        }
-        await trackProgress(
-          `Running update on batch ending ${renderObjectId(end)}`
-        )
-
-        if (typeof update === 'function') {
-          await update(nextBatch)
-        } else {
-          await performUpdate(collection, nextBatch, update)
-        }
-      }
-      await trackProgress(`Completed batch ending ${renderObjectId(end)}`)
-      start = end
-    }
-    return updated
-  } finally {
-    BATCHED_UPDATE_RUNNING = false
-  }
+  ID_EDGE_PAST = await getIdEdgePast(collection)
+  if (!ID_EDGE_PAST) {
+    console.warn(
+      `The collection ${collection.collectionName} appears to be empty.`
+    )
+    return 0
+  }
+  refreshGlobalOptionsForBatchedUpdate(batchedUpdateOptions)
+
+  findOptions = findOptions || {}
+  findOptions.readPreference = READ_PREFERENCE_SECONDARY
+
+  projection = projection || { _id: 1 }
+  let nextBatch
+  let updated = 0
+  let start = BATCH_RANGE_START
+
+  while (start !== BATCH_RANGE_END) {
+    let end = getNextEnd(start)
+    nextBatch = await getNextBatch(
+      collection,
+      query,
+      start,
+      end,
+      projection,
+      findOptions
+    )
+    if (nextBatch.length > 0) {
+      end = nextBatch[nextBatch.length - 1]._id
+      updated += nextBatch.length
+
+      if (VERBOSE_LOGGING) {
+        console.log(
+          `Running update on batch with ids ${JSON.stringify(
+            nextBatch.map(entry => entry._id)
+          )}`
+        )
+      } else {
+        console.error(`Running update on batch ending ${renderObjectId(end)}`)
+      }
+
+      if (typeof update === 'function') {
+        await update(nextBatch)
+      } else {
+        await performUpdate(collection, nextBatch, update)
+      }
+    }
+    console.error(`Completed batch ending ${renderObjectId(end)}`)
+    start = end
+  }
+  return updated
 }

 /**
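The guard the ext-ce side adds is a common Node pattern worth spelling out: because the helper relies on module-global state, concurrent runs would corrupt each other, so the function rejects re-entry and clears its flag in a finally block. A standalone sketch of that single-flight shape (illustrative names, not repo code):

```js
let RUNNING = false

async function runExclusive(work) {
  if (RUNNING) {
    throw new Error('already running') // reject re-entry instead of queueing
  }
  try {
    RUNNING = true
    return await work()
  } finally {
    RUNNING = false // cleared even if work() throws
  }
}

// usage: second concurrent call fails fast
runExclusive(async () => 42).then(console.log)
```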
buildscript.txt
@@ -1,10 +1,10 @@
 mongo-utils
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
package.json
@@ -16,12 +16,12 @@
   "author": "Overleaf (https://www.overleaf.com)",
   "license": "AGPL-3.0-only",
   "dependencies": {
-    "mongodb": "6.12.0",
+    "mongodb": "6.10.0",
     "mongodb-legacy": "6.1.3"
   },
   "devDependencies": {
     "chai": "^4.3.6",
-    "mocha": "^11.1.0",
+    "mocha": "^10.2.0",
     "sandboxed-module": "^2.0.4",
     "sinon": "^9.2.4",
     "sinon-chai": "^3.7.0",
libraries/o-error/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/o-error/.gitignore (new file, vendored, 5 lines)
@@ -0,0 +1,5 @@
+.nyc_output
+coverage
+node_modules/
+
+.npmrc
Node version file (path not captured)
@@ -1 +1 @@
-22.17.0
+20.18.0

buildscript.txt
@@ -1,10 +1,10 @@
 o-error
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
libraries/o-error/index.js (inferred from content)

@@ -1,34 +1,20 @@
 // @ts-check

 /**
  * Light-weight helpers for handling JavaScript Errors in node.js and the
  * browser.
  */
 class OError extends Error {
   /**
-   * The error that is the underlying cause of this error
-   *
-   * @type {unknown}
-   */
-  cause
-
-  /**
-   * List of errors encountered as the callback chain is unwound
-   *
-   * @type {TaggedError[] | undefined}
-   */
-  _oErrorTags
-
-  /**
    * @param {string} message as for built-in Error
    * @param {Object} [info] extra data to attach to the error
-   * @param {unknown} [cause] the internal error that caused this error
+   * @param {Error} [cause] the internal error that caused this error
    */
   constructor(message, info, cause) {
     super(message)
     this.name = this.constructor.name
     if (info) this.info = info
     if (cause) this.cause = cause
+    /** @private @type {Array<TaggedError> | undefined} */
+    this._oErrorTags // eslint-disable-line
   }

   /**

@@ -45,7 +31,7 @@ class OError extends Error {
   /**
    * Wrap the given error, which caused this error.
    *
-   * @param {unknown} cause the internal error that caused this error
+   * @param {Error} cause the internal error that caused this error
    * @return {this}
    */
   withCause(cause) {

@@ -79,16 +65,13 @@ class OError extends Error {
    * }
    * }
    *
-   * @template {unknown} E
-   * @param {E} error the error to tag
+   * @param {Error} error the error to tag
    * @param {string} [message] message with which to tag `error`
    * @param {Object} [info] extra data with wich to tag `error`
-   * @return {E} the modified `error` argument
+   * @return {Error} the modified `error` argument
    */
   static tag(error, message, info) {
-    const oError = /** @type {{ _oErrorTags: TaggedError[] | undefined }} */ (
-      error
-    )
+    const oError = /** @type{OError} */ (error)

     if (!oError._oErrorTags) oError._oErrorTags = []

@@ -119,7 +102,7 @@ class OError extends Error {
    *
    * If an info property is repeated, the last one wins.
    *
-   * @param {unknown} error any error (may or may not be an `OError`)
+   * @param {Error | null | undefined} error any error (may or may not be an `OError`)
    * @return {Object}
    */
   static getFullInfo(error) {

@@ -146,7 +129,7 @@ class OError extends Error {
    * Return the `stack` property from `error`, including the `stack`s for any
    * tagged errors added with `OError.tag` and for any `cause`s.
    *
-   * @param {unknown} error any error (may or may not be an `OError`)
+   * @param {Error | null | undefined} error any error (may or may not be an `OError`)
    * @return {string}
    */
   static getFullStack(error) {

@@ -160,7 +143,7 @@ class OError extends Error {
       stack += `\n${oError._oErrorTags.map(tag => tag.stack).join('\n')}`
     }

-    const causeStack = OError.getFullStack(oError.cause)
+    const causeStack = oError.cause && OError.getFullStack(oError.cause)
     if (causeStack) {
       stack += '\ncaused by:\n' + indent(causeStack)
     }
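For context on what these jsdoc changes are typing, a small usage sketch of the library's tagging API, based on the calls visible above (the package name is assumed and the database helper is invented):

```js
const OError = require('@overleaf/o-error') // package name assumed

async function loadUser(userId) {
  try {
    return await fakeDbLookup(userId) // hypothetical stand-in for a real query
  } catch (err) {
    // tag() rethrows the original error enriched with this frame and info
    throw OError.tag(err, 'failed to load user', { userId })
  }
}

async function fakeDbLookup(userId) {
  throw new Error('connection refused')
}

loadUser('user-123').catch(err => {
  console.error(OError.getFullStack(err)) // original stack plus tagged frames
  console.error(OError.getFullInfo(err)) // { userId: 'user-123' }
})
```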
package.json
@@ -34,7 +34,7 @@
     "@types/chai": "^4.3.0",
     "@types/node": "^18.17.4",
     "chai": "^4.3.6",
-    "mocha": "^11.1.0",
+    "mocha": "^10.2.0",
     "typescript": "^5.0.4"
   }
 }
o-error test files (names not captured)

@@ -268,11 +268,6 @@ describe('utils', function () {
     expect(OError.getFullInfo(null)).to.deep.equal({})
   })

-  it('works when given a string', function () {
-    const err = 'not an error instance'
-    expect(OError.getFullInfo(err)).to.deep.equal({})
-  })
-
   it('works on a normal error', function () {
     const err = new Error('foo')
     expect(OError.getFullInfo(err)).to.deep.equal({})

@@ -35,14 +35,6 @@ describe('OError', function () {
     expect(err2.cause.message).to.equal('cause 2')
   })

-  it('accepts non-Error causes', function () {
-    const err1 = new OError('foo', {}, 'not-an-error')
-    expect(err1.cause).to.equal('not-an-error')
-
-    const err2 = new OError('foo').withCause('not-an-error')
-    expect(err2.cause).to.equal('not-an-error')
-  })
-
   it('handles a custom error type with a cause', function () {
     function doSomethingBadInternally() {
       throw new Error('internal error')
libraries/object-persistor/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/object-persistor/.gitignore (new file, vendored, 4 lines)
@@ -0,0 +1,4 @@
+/node_modules
+*.swp
+
+.npmrc
Node version file (path not captured)
@@ -1 +1 @@
-22.17.0
+20.18.0

buildscript.txt
@@ -1,10 +1,10 @@
 object-persistor
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
package.json
@@ -34,9 +34,9 @@
   "devDependencies": {
     "chai": "^4.3.6",
     "chai-as-promised": "^7.1.1",
-    "mocha": "^11.1.0",
+    "mocha": "^10.2.0",
     "mock-fs": "^5.2.0",
-    "mongodb": "6.12.0",
+    "mongodb": "6.10.0",
     "sandboxed-module": "^2.0.4",
     "sinon": "^9.2.4",
     "sinon-chai": "^3.7.0",
FSPersistor (file name not captured)
@@ -305,10 +305,8 @@ module.exports = class FSPersistor extends AbstractPersistor {

   async _listDirectory(path) {
     if (this.useSubdirectories) {
-      // eslint-disable-next-line @typescript-eslint/return-await
       return await glob(Path.join(path, '**'))
     } else {
-      // eslint-disable-next-line @typescript-eslint/return-await
       return await glob(`${path}_*`)
     }
   }
PerProjectEncryptedS3Persistor (file name not captured)

@@ -33,10 +33,6 @@ const AES256_KEY_LENGTH = 32
  * @property {() => Promise<Array<RootKeyEncryptionKey>>} getRootKeyEncryptionKeys
  */

-/**
- * @typedef {import('./types').ListDirectoryResult} ListDirectoryResult
- */
-
 /**
  * Helper function to make TS happy when accessing error properties
  * AWSError is not an actual class, so we cannot use instanceof.

@@ -347,10 +343,9 @@ class PerProjectEncryptedS3Persistor extends S3Persistor {
   }

   async deleteDirectory(bucketName, path, continuationToken) {
-    // Let [Settings.pathToProjectFolder] validate the project path before deleting things.
-    const { projectFolder, dekPath } = this.#buildProjectPaths(bucketName, path)
-    // Note: Listing/Deleting a prefix does not require SSE-C credentials.
     await super.deleteDirectory(bucketName, path, continuationToken)
+    const { projectFolder, dekPath } = this.#buildProjectPaths(bucketName, path)
     if (projectFolder === path) {
       await super.deleteObject(
         this.#settings.dataEncryptionKeyBucketName,

@@ -395,9 +390,9 @@ class PerProjectEncryptedS3Persistor extends S3Persistor {
  * A general "cache" for project keys is another alternative. For now, use a helper class.
  */
 class CachedPerProjectEncryptedS3Persistor {
   /** @type SSECOptions */
   #projectKeyOptions
   /** @type PerProjectEncryptedS3Persistor */
   #parent
(the two jsdoc lines differ only in whitespace between the sides)

   /**

@@ -418,26 +413,6 @@ class CachedPerProjectEncryptedS3Persistor {
     return await this.sendStream(bucketName, path, fs.createReadStream(fsPath))
   }

-  /**
-   *
-   * @param {string} bucketName
-   * @param {string} path
-   * @return {Promise<number>}
-   */
-  async getObjectSize(bucketName, path) {
-    return await this.#parent.getObjectSize(bucketName, path)
-  }
-
-  /**
-   *
-   * @param {string} bucketName
-   * @param {string} path
-   * @return {Promise<ListDirectoryResult>}
-   */
-  async listDirectory(bucketName, path) {
-    return await this.#parent.listDirectory(bucketName, path)
-  }
-
   /**
    * @param {string} bucketName
    * @param {string} path
S3Persistor (file name not captured)

@@ -20,18 +20,6 @@ const { URL } = require('node:url')
 const { WriteError, ReadError, NotFoundError } = require('./Errors')
 const zlib = require('node:zlib')

-/**
- * @typedef {import('aws-sdk/clients/s3').ListObjectsV2Output} ListObjectsV2Output
- */
-
-/**
- * @typedef {import('aws-sdk/clients/s3').Object} S3Object
- */
-
-/**
- * @typedef {import('./types').ListDirectoryResult} ListDirectoryResult
- */
-
 /**
  * Wrapper with private fields to avoid revealing them on console, JSON.stringify or similar.
  */

@@ -278,12 +266,26 @@ class S3Persistor extends AbstractPersistor {
    * @return {Promise<void>}
    */
   async deleteDirectory(bucketName, key, continuationToken) {
-    const { contents, response } = await this.listDirectory(
-      bucketName,
-      key,
-      continuationToken
-    )
-    const objects = contents.map(item => ({ Key: item.Key || '' }))
+    let response
+    const options = { Bucket: bucketName, Prefix: key }
+    if (continuationToken) {
+      options.ContinuationToken = continuationToken
+    }
+
+    try {
+      response = await this._getClientForBucket(bucketName)
+        .listObjectsV2(options)
+        .promise()
+    } catch (err) {
+      throw PersistorHelper.wrapError(
+        err,
+        'failed to list objects in S3',
+        { bucketName, key },
+        ReadError
+      )
+    }
+
+    const objects = response.Contents?.map(item => ({ Key: item.Key || '' }))
     if (objects?.length) {
       try {
         await this._getClientForBucket(bucketName)

@@ -314,36 +316,6 @@ class S3Persistor extends AbstractPersistor {
     }
   }

-  /**
-   *
-   * @param {string} bucketName
-   * @param {string} key
-   * @param {string} [continuationToken]
-   * @return {Promise<ListDirectoryResult>}
-   */
-  async listDirectory(bucketName, key, continuationToken) {
-    let response
-    const options = { Bucket: bucketName, Prefix: key }
-    if (continuationToken) {
-      options.ContinuationToken = continuationToken
-    }
-
-    try {
-      response = await this._getClientForBucket(bucketName)
-        .listObjectsV2(options)
-        .promise()
-    } catch (err) {
-      throw PersistorHelper.wrapError(
-        err,
-        'failed to list objects in S3',
-        { bucketName, key },
-        ReadError
-      )
-    }
-
-    return { contents: response.Contents ?? [], response }
-  }
-
   /**
    * @param {string} bucketName
    * @param {string} key
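The ext-ce refactor above extracts the listObjectsV2 call into a reusable listDirectory() and has deleteDirectory() consume it. A condensed standalone sketch of that shape, in AWS SDK v2 style to match the .promise() calls in the hunk (bucket-client setup and error wrapping simplified; region/credentials come from the environment):

```js
const S3 = require('aws-sdk/clients/s3')
const s3 = new S3()

async function listDirectory(bucket, prefix, continuationToken) {
  const options = { Bucket: bucket, Prefix: prefix }
  if (continuationToken) options.ContinuationToken = continuationToken
  const response = await s3.listObjectsV2(options).promise()
  return { contents: response.Contents ?? [], response }
}

async function deleteDirectory(bucket, prefix) {
  // reuse the list step instead of inlining it, as the ext-ce side does
  const { contents } = await listDirectory(bucket, prefix)
  const objects = contents.map(item => ({ Key: item.Key || '' }))
  if (objects.length) {
    await s3
      .deleteObjects({ Bucket: bucket, Delete: { Objects: objects } })
      .promise()
  }
}

deleteDirectory('example-bucket', 'some/prefix/').catch(console.error)
```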
libraries/object-persistor/src/types.d.ts (vendored, 6 lines removed)
@@ -1,6 +0,0 @@
-import type { ListObjectsV2Output, Object } from 'aws-sdk/clients/s3'
-
-export type ListDirectoryResult = {
-  contents: Array<Object>
-  response: ListObjectsV2Output
-}
libraries/overleaf-editor-core/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/overleaf-editor-core/.gitignore (new file, vendored, 5 lines)
@@ -0,0 +1,5 @@
+/coverage
+/node_modules
+
+# managed by monorepo$ bin/update_build_scripts
+.npmrc
Node version file (path not captured)
@@ -1 +1 @@
-22.17.0
+20.18.0

buildscript.txt
@@ -1,10 +1,10 @@
 overleaf-editor-core
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
libraries/overleaf-editor-core/index.js (inferred from content)

@@ -18,7 +18,6 @@ const MoveFileOperation = require('./lib/operation/move_file_operation')
 const SetCommentStateOperation = require('./lib/operation/set_comment_state_operation')
 const EditFileOperation = require('./lib/operation/edit_file_operation')
-const EditNoOperation = require('./lib/operation/edit_no_operation')
 const EditOperationTransformer = require('./lib/operation/edit_operation_transformer')
 const SetFileMetadataOperation = require('./lib/operation/set_file_metadata_operation')
 const NoOperation = require('./lib/operation/no_operation')
 const Operation = require('./lib/operation')

@@ -44,8 +43,6 @@ const TrackingProps = require('./lib/file_data/tracking_props')
 const Range = require('./lib/range')
 const CommentList = require('./lib/file_data/comment_list')
-const LazyStringFileData = require('./lib/file_data/lazy_string_file_data')
-const StringFileData = require('./lib/file_data/string_file_data')
 const EditOperationBuilder = require('./lib/operation/edit_operation_builder')

 exports.AddCommentOperation = AddCommentOperation
 exports.Author = Author

@@ -61,7 +58,6 @@ exports.DeleteCommentOperation = DeleteCommentOperation
 exports.File = File
 exports.FileMap = FileMap
-exports.LazyStringFileData = LazyStringFileData
 exports.StringFileData = StringFileData
 exports.History = History
 exports.Label = Label
 exports.AddFileOperation = AddFileOperation

@@ -69,8 +65,6 @@ exports.MoveFileOperation = MoveFileOperation
 exports.SetCommentStateOperation = SetCommentStateOperation
 exports.EditFileOperation = EditFileOperation
-exports.EditNoOperation = EditNoOperation
-exports.EditOperationBuilder = EditOperationBuilder
 exports.EditOperationTransformer = EditOperationTransformer
 exports.SetFileMetadataOperation = SetFileMetadataOperation
 exports.NoOperation = NoOperation
 exports.Operation = Operation
lib/change.js (inferred)

@@ -13,7 +13,7 @@ const V2DocVersions = require('./v2_doc_versions')

 /**
  * @import Author from "./author"
- * @import { BlobStore, RawChange, ReadonlyBlobStore } from "./types"
+ * @import { BlobStore } from "./types"
  */

 /**

@@ -54,7 +54,7 @@ class Change {
   /**
    * For serialization.
    *
-   * @return {RawChange}
+   * @return {Object}
    */
   toRaw() {
     function toRaw(object) {

@@ -100,9 +100,6 @@ class Change {
     )
   }

-  /**
-   * @return {Operation[]}
-   */
   getOperations() {
     return this.operations
   }

@@ -219,7 +216,7 @@ class Change {
   * If this Change contains any File objects, load them.
   *
   * @param {string} kind see {File#load}
-  * @param {ReadonlyBlobStore} blobStore
+  * @param {BlobStore} blobStore
   * @return {Promise<void>}
   */
  async loadFiles(kind, blobStore) {

@@ -251,24 +248,6 @@ class Change {
   * @param {boolean} [opts.strict] - Do not ignore recoverable errors
   */
  applyTo(snapshot, opts = {}) {
-    // eslint-disable-next-line no-unused-vars
-    for (const operation of this.iterativelyApplyTo(snapshot, opts)) {
-      // Nothing to do: we're just consuming the iterator for the side effects
-    }
-  }
-
-  /**
-   * Generator that applies this change to a snapshot and yields each
-   * operation after it has been applied.
-   *
-   * Recoverable errors (caused by historical bad data) are ignored unless
-   * opts.strict is true
-   *
-   * @param {Snapshot} snapshot modified in place
-   * @param {object} opts
-   * @param {boolean} [opts.strict] - Do not ignore recoverable errors
-   */
-  *iterativelyApplyTo(snapshot, opts = {}) {
    assert.object(snapshot, 'bad snapshot')

    for (const operation of this.operations) {

@@ -282,7 +261,6 @@ class Change {
        throw err
      }
    }
-    yield operation
  }

  // update project version if present in change
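The ext-ce side above re-expresses applyTo as a thin wrapper that drains a generator, so callers can either observe each applied operation or just run the whole change for its side effects. A standalone sketch of that generator-for-side-effects pattern (invented names, not library code):

```js
function* applySteps(state, steps) {
  for (const step of steps) {
    step(state) // mutate in place, like Change#iterativelyApplyTo
    yield step // hand control back after each applied step
  }
}

const state = { n: 0 }

// Observe progress step by step:
for (const step of applySteps(state, [s => s.n++, s => (s.n *= 10)])) {
  console.log('applied a step, n =', state.n) // 1, then 10
}

// Or drain purely for the side effects, like the wrapper applyTo():
// eslint-disable-next-line no-unused-vars
for (const _ of applySteps(state, [s => s.n++])) {
  // nothing: consuming the iterator runs the steps
}
```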
lib/file_data/clear_tracking_props.js (inferred)

@@ -1,7 +1,7 @@
 // @ts-check

 /**
- * @import { ClearTrackingPropsRawData, TrackingDirective } from '../types'
+ * @import { ClearTrackingPropsRawData } from '../types'
  */

 class ClearTrackingProps {

@@ -11,27 +11,12 @@ class ClearTrackingProps {

   /**
    * @param {any} other
-   * @returns {other is ClearTrackingProps}
+   * @returns {boolean}
    */
   equals(other) {
     return other instanceof ClearTrackingProps
   }

-  /**
-   * @param {TrackingDirective} other
-   * @returns {other is ClearTrackingProps}
-   */
-  canMergeWith(other) {
-    return other instanceof ClearTrackingProps
-  }
-
-  /**
-   * @param {TrackingDirective} other
-   */
-  mergeWith(other) {
-    return this
-  }
-
   /**
    * @returns {ClearTrackingPropsRawData}
    */
lib/file_data/lazy_string_file_data.js (inferred)

@@ -11,7 +11,7 @@ const EditOperation = require('../operation/edit_operation')
 const EditOperationBuilder = require('../operation/edit_operation_builder')

 /**
- * @import { BlobStore, ReadonlyBlobStore, RangesBlob, RawHashFileData, RawLazyStringFileData } from '../types'
+ * @import { BlobStore, ReadonlyBlobStore, RangesBlob, RawFileData, RawLazyStringFileData } from '../types'
  */

 class LazyStringFileData extends FileData {

@@ -159,11 +159,11 @@ class LazyStringFileData extends FileData {

   /** @inheritdoc
    * @param {BlobStore} blobStore
-   * @return {Promise<RawHashFileData>}
+   * @return {Promise<RawFileData>}
    */
   async store(blobStore) {
     if (this.operations.length === 0) {
-      /** @type RawHashFileData */
+      /** @type RawFileData */
       const raw = { hash: this.hash }
       if (this.rangesHash) {
         raw.rangesHash = this.rangesHash

@@ -171,11 +171,9 @@ class LazyStringFileData extends FileData {
       return raw
     }
     const eager = await this.toEager(blobStore)
-    const raw = await eager.store(blobStore)
-    this.hash = raw.hash
-    this.rangesHash = raw.rangesHash
-    this.operations.length = 0
-    return raw
+    /** @type RawFileData */
+    return await eager.store(blobStore)
   }
 }
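The ext-ce store() above does extra bookkeeping: after persisting pending operations it records the returned hash and empties the operation list, so a second store() call is a cheap no-op instead of a re-upload. A standalone sketch of that caching shape (a minimal invented class, not the library's API):

```js
class CachedDoc {
  constructor(hash) {
    this.hash = hash
    this.pendingOps = []
  }

  async store(save) {
    if (this.pendingOps.length === 0) {
      return { hash: this.hash } // nothing changed since the last store
    }
    const raw = await save(this.pendingOps)
    this.hash = raw.hash // remember what we just persisted
    this.pendingOps.length = 0 // drop the now-applied edits
    return raw
  }
}

new CachedDoc('abc')
  .store(async ops => ({ hash: 'def' }))
  .then(console.log) // { hash: 'abc' } — no pending edits, nothing uploaded
```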
@ -8,7 +8,7 @@ const CommentList = require('./comment_list')
|
|||
const TrackedChangeList = require('./tracked_change_list')
|
||||
|
||||
/**
|
||||
* @import { StringFileRawData, RawHashFileData, BlobStore, CommentRawData } from "../types"
|
||||
* @import { StringFileRawData, RawFileData, BlobStore, CommentRawData } from "../types"
|
||||
* @import { TrackedChangeRawData, RangesBlob } from "../types"
|
||||
* @import EditOperation from "../operation/edit_operation"
|
||||
*/
|
||||
|
@@ -88,14 +88,6 @@ class StringFileData {
     return content
   }

-  /**
-   * Return docstore view of a doc: each line separated
-   * @return {string[]}
-   */
-  getLines() {
-    return this.getContent({ filterTrackedDeletes: true }).split('\n')
-  }
-
   /** @inheritdoc */
   getByteLength() {
     return Buffer.byteLength(this.content)

@@ -139,7 +131,7 @@ class StringFileData {
   /**
    * @inheritdoc
    * @param {BlobStore} blobStore
-   * @return {Promise<RawHashFileData>}
+   * @return {Promise<RawFileData>}
    */
   async store(blobStore) {
     const blob = await blobStore.putString(this.content)

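A small sketch of the `getLines()` helper that exists only on the ext-ce side, assuming `fromRaw` behaves as it does elsewhere in this library:

```js
const doc = StringFileData.fromRaw({ content: 'hello\nworld' })
// Tracked deletions are filtered out, then the content is split into
// the one-entry-per-line array that docstore expects.
doc.getLines() // => ['hello', 'world']
```
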
@@ -84,21 +84,6 @@ class TrackedChange {
       )
     )
   }
-
-  /**
-   * Return an equivalent tracked change whose extent is limited to the given
-   * range
-   *
-   * @param {Range} range
-   * @returns {TrackedChange | null} - the result or null if the intersection is empty
-   */
-  intersectRange(range) {
-    const intersection = this.range.intersect(range)
-    if (intersection == null) {
-      return null
-    }
-    return new TrackedChange(intersection, this.tracking)
-  }
 }

 module.exports = TrackedChange

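Hypothetical values showing what the removed `intersectRange()` returns; a `Range(pos, length)` covers positions `pos` through `pos + length - 1`:

```js
const change = new TrackedChange(
  new Range(5, 10), // positions 5..14
  new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
)
change.intersectRange(new Range(0, 8))  // => TrackedChange over Range(5, 3)
change.intersectRange(new Range(20, 5)) // => null, the ranges are disjoint
```
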
@@ -2,11 +2,9 @@
 const Range = require('../range')
 const TrackedChange = require('./tracked_change')
 const TrackingProps = require('../file_data/tracking_props')
-const { InsertOp, RemoveOp, RetainOp } = require('../operation/scan_op')

 /**
  * @import { TrackingDirective, TrackedChangeRawData } from "../types"
- * @import TextOperation from "../operation/text_operation"
  */

 class TrackedChangeList {

@@ -60,22 +58,6 @@ class TrackedChangeList {
     return this._trackedChanges.filter(change => range.contains(change.range))
   }

-  /**
-   * Returns tracked changes that overlap with the given range
-   * @param {Range} range
-   * @returns {TrackedChange[]}
-   */
-  intersectRange(range) {
-    const changes = []
-    for (const change of this._trackedChanges) {
-      const intersection = change.intersectRange(range)
-      if (intersection != null) {
-        changes.push(intersection)
-      }
-    }
-    return changes
-  }
-
   /**
    * Returns the tracking props for a given range.
    * @param {Range} range

@@ -107,8 +89,6 @@ class TrackedChangeList {

   /**
    * Collapses consecutive (and compatible) ranges
-   *
-   * @private
    * @returns {void}
    */
   _mergeRanges() {

@@ -137,28 +117,12 @@ class TrackedChangeList {
   }

   /**
    * Apply an insert operation
    *
    * @param {number} cursor
    * @param {string} insertedText
    * @param {{tracking?: TrackingProps}} opts
    */
   applyInsert(cursor, insertedText, opts = {}) {
-    this._applyInsert(cursor, insertedText, opts)
-    this._mergeRanges()
-  }
-
-  /**
-   * Apply an insert operation
-   *
-   * This method will not merge ranges at the end
-   *
-   * @private
-   * @param {number} cursor
-   * @param {string} insertedText
-   * @param {{tracking?: TrackingProps}} [opts]
-   */
-  _applyInsert(cursor, insertedText, opts = {}) {
     const newTrackedChanges = []
     for (const trackedChange of this._trackedChanges) {
       if (

@@ -207,29 +171,15 @@ class TrackedChangeList {
       newTrackedChanges.push(newTrackedChange)
     }
     this._trackedChanges = newTrackedChanges
+    this._mergeRanges()
   }

   /**
    * Apply a delete operation to the list of tracked changes
    *
    * @param {number} cursor
    * @param {number} length
    */
   applyDelete(cursor, length) {
-    this._applyDelete(cursor, length)
-    this._mergeRanges()
-  }
-
-  /**
-   * Apply a delete operation to the list of tracked changes
-   *
-   * This method will not merge ranges at the end
-   *
-   * @private
-   * @param {number} cursor
-   * @param {number} length
-   */
-  _applyDelete(cursor, length) {
     const newTrackedChanges = []
     for (const trackedChange of this._trackedChanges) {
       const deletedRange = new Range(cursor, length)

@@ -255,31 +205,15 @@ class TrackedChangeList {
       }
     }
     this._trackedChanges = newTrackedChanges
   }

-  /**
-   * Apply a retain operation to the list of tracked changes
-   *
-   * @param {number} cursor
-   * @param {number} length
-   * @param {{tracking?: TrackingDirective}} [opts]
-   */
-  applyRetain(cursor, length, opts = {}) {
-    this._applyRetain(cursor, length, opts)
-    this._mergeRanges()
-  }
-
   /**
    * Apply a retain operation to the list of tracked changes
    *
-   * This method will not merge ranges at the end
-   *
-   * @private
    * @param {number} cursor
    * @param {number} length
    * @param {{tracking?: TrackingDirective}} opts
    */
-  _applyRetain(cursor, length, opts = {}) {
+  applyRetain(cursor, length, opts = {}) {
     // If there's no tracking info, leave everything as-is
     if (!opts.tracking) {
       return

@@ -335,31 +269,6 @@ class TrackedChangeList {
       newTrackedChanges.push(newTrackedChange)
     }
     this._trackedChanges = newTrackedChanges
   }
-
-  /**
-   * Apply a text operation to the list of tracked changes
-   *
-   * Ranges are merged only once at the end, for performance and to avoid
-   * problematic edge cases where intermediate ranges get incorrectly merged.
-   *
-   * @param {TextOperation} operation
-   */
-  applyTextOperation(operation) {
-    // this cursor tracks the destination document that gets modified as
-    // operations are applied to it.
-    let cursor = 0
-    for (const op of operation.ops) {
-      if (op instanceof InsertOp) {
-        this._applyInsert(cursor, op.insertion, { tracking: op.tracking })
-        cursor += op.insertion.length
-      } else if (op instanceof RemoveOp) {
-        this._applyDelete(cursor, op.length)
-      } else if (op instanceof RetainOp) {
-        this._applyRetain(cursor, op.length, { tracking: op.tracking })
-        cursor += op.length
-      }
-    }
-    this._mergeRanges()
-  }
 }

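The pattern removed across these hunks: public `applyInsert`/`applyDelete`/`applyRetain` wrappers merge ranges once at the end, while the private `_apply*` variants skip merging so that `applyTextOperation()` can run a whole operation and merge a single time. A hypothetical illustration of the merge step, assuming `fromRaw`/`toRaw` behave as they do elsewhere in this library:

```js
const tracking = new TrackingProps('insert', 'user1', new Date())
const changes = TrackedChangeList.fromRaw([])
changes.applyInsert(0, 'foo', { tracking })
changes.applyInsert(3, 'bar', { tracking }) // adjacent and compatible
// After the final merge there is one tracked change covering Range(0, 6),
// not two separate length-3 changes.
console.log(changes.toRaw().length) // 1
```
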
@@ -62,35 +62,6 @@ class TrackingProps {
       this.ts.getTime() === other.ts.getTime()
     )
   }
-
-  /**
-   * Are these tracking props compatible with the other tracking props for merging
-   * ranges?
-   *
-   * @param {TrackingDirective} other
-   * @returns {other is TrackingProps}
-   */
-  canMergeWith(other) {
-    if (!(other instanceof TrackingProps)) {
-      return false
-    }
-    return this.type === other.type && this.userId === other.userId
-  }
-
-  /**
-   * Merge two tracking props
-   *
-   * Assumes that `canMerge(other)` returns true
-   *
-   * @param {TrackingDirective} other
-   */
-  mergeWith(other) {
-    if (!this.canMergeWith(other)) {
-      throw new Error('Cannot merge with incompatible tracking props')
-    }
-    const ts = this.ts <= other.ts ? this.ts : other.ts
-    return new TrackingProps(this.type, this.userId, ts)
-  }
 }

 module.exports = TrackingProps

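The removed methods encode the ext-ce merge rule: ranges are mergeable when type and user match, and merging keeps the earlier timestamp. With hypothetical values:

```js
const a = new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:01.000Z'))
const b = new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
a.canMergeWith(b)               // => true: same type and user, ts may differ
a.mergeWith(b).ts.toISOString() // => '2024-01-01T00:00:00.000Z', earlier ts wins

const c = new TrackingProps('delete', 'user1', new Date())
a.canMergeWith(c) // => false: different type
a.mergeWith(c)    // throws 'Cannot merge with incompatible tracking props'
```
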
@@ -22,7 +22,7 @@ class NonUniquePathnameError extends PathnameError {
    * @param {string[]} pathnames
    */
   constructor(pathnames) {
-    super('pathnames are not unique', { pathnames })
+    super('pathnames are not unique: ' + pathnames, { pathnames })
     this.pathnames = pathnames
   }
 }

@@ -30,13 +30,9 @@ class NonUniquePathnameError extends PathnameError {
 class BadPathnameError extends PathnameError {
   /**
    * @param {string} pathname
-   * @param {string} reason
    */
-  constructor(pathname, reason) {
-    if (pathname.length > 10) {
-      pathname = pathname.slice(0, 5) + '...' + pathname.slice(-5)
-    }
-    super('invalid pathname', { reason, pathname })
+  constructor(pathname) {
+    super(pathname + ' is not a valid pathname', { pathname })
     this.pathname = pathname
   }
 }

@@ -46,7 +42,7 @@ class PathnameConflictError extends PathnameError {
    * @param {string} pathname
    */
   constructor(pathname) {
-    super('pathname conflicts with another file', { pathname })
+    super(`pathname '${pathname}' conflicts with another file`, { pathname })
     this.pathname = pathname
   }
 }

@@ -56,7 +52,7 @@ class FileNotFoundError extends PathnameError {
    * @param {string} pathname
    */
   constructor(pathname) {
-    super('file does not exist', { pathname })
+    super(`file ${pathname} does not exist`, { pathname })
     this.pathname = pathname
   }
 }

@@ -319,9 +315,8 @@ function checkPathnamesAreUnique(files) {
  */
 function checkPathname(pathname) {
   assert.nonEmptyString(pathname, 'bad pathname')
-  const [isClean, reason] = safePathname.isCleanDebug(pathname)
-  if (isClean) return
-  throw new FileMap.BadPathnameError(pathname, reason)
+  if (safePathname.isClean(pathname)) return
+  throw new FileMap.BadPathnameError(pathname)
 }

 /**

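How the ext-ce pieces above fit together, sketched with a made-up pathname ('/foo' is rejected by the 'no leading /' rule, as the safePathname tests later in this diff show):

```js
const [isClean, reason] = safePathname.isCleanDebug('/foo')
if (!isClean) {
  // reason === 'no leading /'. BadPathnameError keeps the reason in the
  // error info and shortens very long pathnames to a 'start...end' form
  // before putting them in the info object.
  throw new FileMap.BadPathnameError('/foo', reason)
}
```
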
@@ -7,7 +7,7 @@ const Change = require('./change')
 const Snapshot = require('./snapshot')

 /**
- * @import { BlobStore, ReadonlyBlobStore } from "./types"
+ * @import { BlobStore } from "./types"
  */

 class History {

@@ -85,7 +85,7 @@ class History {
    * If this History contains any File objects, load them.
    *
    * @param {string} kind see {File#load}
-   * @param {ReadonlyBlobStore} blobStore
+   * @param {BlobStore} blobStore
    * @return {Promise<void>}
    */
   async loadFiles(kind, blobStore) {

@@ -36,20 +36,6 @@ class EditOperationBuilder {
     }
     throw new Error('Unsupported operation in EditOperationBuilder.fromJSON')
   }
-
-  /**
-   * @param {unknown} raw
-   * @return {raw is RawEditOperation}
-   */
-  static isValid(raw) {
-    return (
-      isTextOperation(raw) ||
-      isRawAddCommentOperation(raw) ||
-      isRawDeleteCommentOperation(raw) ||
-      isRawSetCommentStateOperation(raw) ||
-      isRawEditNoOperation(raw)
-    )
-  }
 }

 /**

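A sketch of how the removed `isValid()` guard is meant to be used, with a hypothetical payload (`[3, "abc", -2]` is retain 3, insert 'abc', remove 2 in this library's raw format):

```js
const raw = JSON.parse('{ "textOperation": [3, "abc", -2] }')
if (EditOperationBuilder.isValid(raw)) {
  // The `raw is RawEditOperation` type guard makes this call type-safe,
  // instead of relying on fromJSON throwing on unsupported input.
  const operation = EditOperationBuilder.fromJSON(raw)
}
```
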
@@ -13,7 +13,7 @@ let EditFileOperation = null
 let SetFileMetadataOperation = null

 /**
- * @import { ReadonlyBlobStore } from "../types"
+ * @import { BlobStore } from "../types"
  * @import Snapshot from "../snapshot"
  */

@@ -80,7 +80,7 @@ class Operation {
    * If this operation references any files, load the files.
    *
    * @param {string} kind see {File#load}
-   * @param {ReadOnlyBlobStore} blobStore
+   * @param {BlobStore} blobStore
    * @return {Promise<void>}
    */
   async loadFiles(kind, blobStore) {}

@@ -175,7 +175,7 @@ class InsertOp extends ScanOp {
       return false
     }
     if (this.tracking) {
-      if (!other.tracking || !this.tracking.canMergeWith(other.tracking)) {
+      if (!this.tracking.equals(other.tracking)) {
         return false
       }
     } else if (other.tracking) {

@@ -198,10 +198,7 @@ class InsertOp extends ScanOp {
       throw new Error('Cannot merge with incompatible operation')
     }
     this.insertion += other.insertion
-    if (this.tracking != null && other.tracking != null) {
-      this.tracking = this.tracking.mergeWith(other.tracking)
-    }
-    // We already have the same commentIds
+    // We already have the same tracking info and commentIds
   }

   /**

@@ -309,13 +306,9 @@ class RetainOp extends ScanOp {
       return false
     }
     if (this.tracking) {
-      if (!other.tracking || !this.tracking.canMergeWith(other.tracking)) {
-        return false
-      }
-    } else if (other.tracking) {
-      return false
+      return this.tracking.equals(other.tracking)
     }
-    return true
+    return !other.tracking
   }

   /**

@@ -326,9 +319,6 @@ class RetainOp extends ScanOp {
       throw new Error('Cannot merge with incompatible operation')
     }
     this.length += other.length
-    if (this.tracking != null && other.tracking != null) {
-      this.tracking = this.tracking.mergeWith(other.tracking)
-    }
   }

   /**

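What changes in behaviour between the two sides of these scan-op hunks, using the same hypothetical ops as the tests further down in this diff:

```js
const op1 = new RetainOp(
  4,
  new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
)
const op2 = new RetainOp(
  4,
  new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:01.000Z'))
)
// ext-ce side: mergeable, since user and type match; the merged RetainOp(8)
// keeps the earlier timestamp via TrackingProps.mergeWith().
// v5.3.1 side: not mergeable, because the timestamps are not identical.
op1.canMergeWith(op2)
```
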
@@ -56,34 +56,18 @@ class TextOperation extends EditOperation {

   constructor() {
     super()
-
-    /**
-     * When an operation is applied to an input string, you can think of this as
-     * if an imaginary cursor runs over the entire string and skips over some
-     * parts, removes some parts and inserts characters at some positions. These
-     * actions (skip/remove/insert) are stored as an array in the "ops" property.
-     * @type {ScanOp[]}
-     */
+    // When an operation is applied to an input string, you can think of this as
+    // if an imaginary cursor runs over the entire string and skips over some
+    // parts, removes some parts and inserts characters at some positions. These
+    // actions (skip/remove/insert) are stored as an array in the "ops" property.
+    /** @type {ScanOp[]} */
     this.ops = []
-
-    /**
-     * An operation's baseLength is the length of every string the operation
-     * can be applied to.
-     */
+    // An operation's baseLength is the length of every string the operation
+    // can be applied to.
     this.baseLength = 0
-
-    /**
-     * The targetLength is the length of every string that results from applying
-     * the operation on a valid input string.
-     */
+    // The targetLength is the length of every string that results from applying
+    // the operation on a valid input string.
     this.targetLength = 0
-
-    /**
-     * The expected content hash after this operation is applied
-     *
-     * @type {string | null}
-     */
-    this.contentHash = null
   }

   /**

@@ -239,12 +223,7 @@ class TextOperation extends EditOperation {
    * @returns {RawTextOperation}
    */
   toJSON() {
-    /** @type {RawTextOperation} */
-    const json = { textOperation: this.ops.map(op => op.toJSON()) }
-    if (this.contentHash != null) {
-      json.contentHash = this.contentHash
-    }
-    return json
+    return { textOperation: this.ops.map(op => op.toJSON()) }
   }

   /**

@@ -252,7 +231,7 @@ class TextOperation extends EditOperation {
    * @param {RawTextOperation} obj
    * @returns {TextOperation}
    */
-  static fromJSON = function ({ textOperation: ops, contentHash }) {
+  static fromJSON = function ({ textOperation: ops }) {
     const o = new TextOperation()
     for (const op of ops) {
       if (isRetain(op)) {

@@ -271,9 +250,6 @@ class TextOperation extends EditOperation {
       throw new UnprocessableError('unknown operation: ' + JSON.stringify(op))
     }
   }
-  if (contentHash != null) {
-    o.contentHash = contentHash
-  }
   return o
 }

|
|||
str
|
||||
)
|
||||
}
|
||||
file.trackedChanges.applyRetain(result.length, op.length, {
|
||||
tracking: op.tracking,
|
||||
})
|
||||
result += str.slice(inputCursor, inputCursor + op.length)
|
||||
inputCursor += op.length
|
||||
} else if (op instanceof InsertOp) {
|
||||
if (containsNonBmpChars(op.insertion)) {
|
||||
throw new InvalidInsertionError(str, op.toJSON())
|
||||
}
|
||||
file.trackedChanges.applyInsert(result.length, op.insertion, {
|
||||
tracking: op.tracking,
|
||||
})
|
||||
file.comments.applyInsert(
|
||||
new Range(result.length, op.insertion.length),
|
||||
{ commentIds: op.commentIds }
|
||||
)
|
||||
result += op.insertion
|
||||
} else if (op instanceof RemoveOp) {
|
||||
file.trackedChanges.applyDelete(result.length, op.length)
|
||||
file.comments.applyDelete(new Range(result.length, op.length))
|
||||
inputCursor += op.length
|
||||
} else {
|
||||
|
@ -345,8 +328,6 @@ class TextOperation extends EditOperation {
|
|||
throw new TextOperation.TooLongError(operation, result.length)
|
||||
}
|
||||
|
||||
file.trackedChanges.applyTextOperation(this)
|
||||
|
||||
file.content = result
|
||||
}
|
||||
|
||||
|
@@ -395,36 +376,44 @@ class TextOperation extends EditOperation {
     for (let i = 0, l = ops.length; i < l; i++) {
       const op = ops[i]
       if (op instanceof RetainOp) {
-        if (op.tracking) {
-          // Where we need to end up after the retains
-          const target = strIndex + op.length
-          // A previous retain could have overriden some tracking info. Now we
-          // need to restore it.
-          const previousChanges = previousState.trackedChanges.intersectRange(
-            new Range(strIndex, op.length)
-          )
-
-          for (const change of previousChanges) {
-            if (strIndex < change.range.start) {
-              inverse.retain(change.range.start - strIndex, {
-                tracking: new ClearTrackingProps(),
-              })
-              strIndex = change.range.start
-            }
-            inverse.retain(change.range.length, {
-              tracking: change.tracking,
-            })
-            strIndex += change.range.length
-          }
-          if (strIndex < target) {
-            inverse.retain(target - strIndex, {
-              tracking: new ClearTrackingProps(),
-            })
-            strIndex = target
-          }
-        } else {
-          inverse.retain(op.length)
-          strIndex += op.length
-        }
+        // Where we need to end up after the retains
+        const target = strIndex + op.length
+        // A previous retain could have overriden some tracking info. Now we
+        // need to restore it.
+        const previousRanges = previousState.trackedChanges.inRange(
+          new Range(strIndex, op.length)
+        )
+
+        let removeTrackingInfoIfNeeded
+        if (op.tracking) {
+          removeTrackingInfoIfNeeded = new ClearTrackingProps()
+        }
+
+        for (const trackedChange of previousRanges) {
+          if (strIndex < trackedChange.range.start) {
+            inverse.retain(trackedChange.range.start - strIndex, {
+              tracking: removeTrackingInfoIfNeeded,
+            })
+            strIndex = trackedChange.range.start
+          }
+          if (trackedChange.range.end < strIndex + op.length) {
+            inverse.retain(trackedChange.range.length, {
+              tracking: trackedChange.tracking,
+            })
+            strIndex = trackedChange.range.end
+          }
+          if (trackedChange.range.end !== strIndex) {
+            // No need to split the range at the end
+            const [left] = trackedChange.range.splitAt(strIndex)
+            inverse.retain(left.length, { tracking: trackedChange.tracking })
+            strIndex = left.end
+          }
+        }
+        if (strIndex < target) {
+          inverse.retain(target - strIndex, {
+            tracking: removeTrackingInfoIfNeeded,
+          })
+          strIndex = target
+        }
       } else if (op instanceof InsertOp) {
         inverse.remove(op.insertion.length)

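The property both versions of `invert()` aim for, sketched with hypothetical data and the method names used elsewhere in this diff (the removed "undoing a tracked delete" test further down exercises exactly this guarantee):

```js
const file = StringFileData.fromRaw({ content: 'the quick brown fox' })
const markDeleted = new TextOperation()
  .retain(4)
  .retain(5, { tracking: new TrackingProps('delete', 'user1', new Date()) })
  .retain(10)
// Build the inverse against the state before the operation is applied.
const inverse = markDeleted.invert(file)
markDeleted.apply(file)
inverse.apply(file)
// file is now back to its old state, text and tracked ranges alike,
// including any tracking info the tracked delete had overwritten.
```
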
@@ -86,32 +86,10 @@ class Range {
   }

   /**
-   * Does this range overlap another range?
-   *
-   * Overlapping means that the two ranges have at least one character in common
-   *
-   * @param {Range} other - the other range
+   * @param {Range} range
    */
-  overlaps(other) {
-    return this.start < other.end && this.end > other.start
-  }
-
-  /**
-   * Does this range overlap the start of another range?
-   *
-   * @param {Range} other - the other range
-   */
-  overlapsStart(other) {
-    return this.start <= other.start && this.end > other.start
-  }
-
-  /**
-   * Does this range overlap the end of another range?
-   *
-   * @param {Range} other - the other range
-   */
-  overlapsEnd(other) {
-    return this.start < other.end && this.end >= other.end
-  }
+  overlaps(range) {
+    return this.start < range.end && this.end > range.start
+  }

   /**

@@ -249,26 +227,6 @@ class Range {
     )
     return [rangeUpToCursor, rangeAfterCursor]
   }
-
-  /**
-   * Returns the intersection of this range with another range
-   *
-   * @param {Range} other - the other range
-   * @return {Range | null} the intersection or null if the intersection is empty
-   */
-  intersect(other) {
-    if (this.contains(other)) {
-      return other
-    } else if (other.contains(this)) {
-      return this
-    } else if (other.overlapsStart(this)) {
-      return new Range(this.pos, other.end - this.start)
-    } else if (other.overlapsEnd(this)) {
-      return new Range(other.pos, this.end - other.start)
-    } else {
-      return null
-    }
-  }
 }

 module.exports = Range

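Hypothetical ranges showing the helpers this pair of hunks removes (`start === pos`, `end === pos + length`):

```js
const a = new Range(0, 10) // positions 0..9
const b = new Range(5, 10) // positions 5..14
a.overlaps(b)      // => true: they share positions 5..9
a.overlapsStart(b) // => true: a covers b's first position
a.overlapsEnd(b)   // => false: a ends before b does
a.intersect(b)     // => Range(5, 5), per the intersect() removed above
b.intersect(new Range(20, 3)) // => null, no positions in common
```
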
@@ -64,57 +64,17 @@ function cleanPart(filename) {
  * @return {String}
  */
 exports.clean = function (pathname) {
-  return exports.cleanDebug(pathname)[0]
-}
-
-/**
- * See clean
- * @param {string} pathname
- * @return {[string,string]}
- */
-exports.cleanDebug = function (pathname) {
-  let prev = pathname
-  let reason = ''
-
-  /**
-   * @param {string} label
-   */
-  function recordReasonIfChanged(label) {
-    if (pathname === prev) return
-    if (reason) reason += ','
-    reason += label
-    prev = pathname
-  }
   pathname = path.normalize(pathname)
-  recordReasonIfChanged('normalize')
-
-  pathname = pathname.replace(/\\/g, '/')
-  recordReasonIfChanged('workaround for IE')
-
-  pathname = pathname.replace(/\/+/g, '/')
-  recordReasonIfChanged('no multiple slashes')
-
-  pathname = pathname.replace(/^(\/.*)$/, '_$1')
-  recordReasonIfChanged('no leading /')
-
-  pathname = pathname.replace(/^(.+)\/$/, '$1')
-  recordReasonIfChanged('no trailing /')
-
-  pathname = pathname.replace(/^ *(.*)$/, '$1')
-  recordReasonIfChanged('no leading spaces')
-
-  pathname = pathname.replace(/^(.*[^ ]) *$/, '$1')
-  recordReasonIfChanged('no trailing spaces')
-
+  pathname = pathname.replace(/\\/g, '/') // workaround for IE
+  pathname = pathname.replace(/\/+/g, '/') // no multiple slashes
+  pathname = pathname.replace(/^(\/.*)$/, '_$1') // no leading /
+  pathname = pathname.replace(/^(.+)\/$/, '$1') // no trailing /
+  pathname = pathname.replace(/^ *(.*)$/, '$1') // no leading spaces
+  pathname = pathname.replace(/^(.*[^ ]) *$/, '$1') // no trailing spaces
   if (pathname.length === 0) pathname = '_'
-  recordReasonIfChanged('empty')
-
   pathname = pathname.split('/').map(cleanPart).join('/')
-  recordReasonIfChanged('cleanPart')
-
   pathname = pathname.replace(BLOCKED_FILE_RX, '@$1')
-  recordReasonIfChanged('BLOCKED_FILE_RX')
-  return [pathname, reason]
+  return pathname
 }

 /**

@@ -124,19 +84,9 @@ exports.cleanDebug = function (pathname) {
  * @return {Boolean}
  */
 exports.isClean = function pathnameIsClean(pathname) {
-  return exports.isCleanDebug(pathname)[0]
-}
-
-/**
- * A pathname is clean (see clean) and not too long.
- *
- * @param {string} pathname
- * @return {[boolean,string]}
- */
-exports.isCleanDebug = function (pathname) {
-  if (pathname.length > MAX_PATH) return [false, 'MAX_PATH']
-  if (pathname.length === 0) return [false, 'empty']
-  const [cleanPathname, reason] = exports.cleanDebug(pathname)
-  if (cleanPathname !== pathname) return [false, reason]
-  return [true, '']
+  return (
+    exports.clean(pathname) === pathname &&
+    pathname.length <= MAX_PATH &&
+    pathname.length > 0
+  )
 }

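What the removed debug variants return, using inputs taken from the safePathname tests later in this diff:

```js
safePathname.cleanDebug('/foo')
// => ['_/foo', 'no leading /']: the cleaned name plus the rules that fired
safePathname.cleanDebug('foo//bar.png')
// => ['foo/bar.png', 'normalize']
safePathname.isCleanDebug('foo/bar.png') // => [true, '']
safePathname.isCleanDebug('/foo')        // => [false, 'no leading /']
```
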
@@ -224,7 +224,7 @@ class Snapshot {
    *
    * @param {string} kind see {File#load}
    * @param {ReadonlyBlobStore} blobStore
-   * @return {Promise<Record<string, File>>} an object where keys are the pathnames and
+   * @return {Promise<Object>} an object where keys are the pathnames and
    *   values are the files in the snapshot
    */
   async loadFiles(kind, blobStore) {

@@ -132,7 +132,6 @@ export type RawScanOp = RawInsertOp | RawRemoveOp | RawRetainOp

 export type RawTextOperation = {
   textOperation: RawScanOp[]
-  contentHash?: string
 }

 export type RawAddCommentOperation = {

@@ -20,7 +20,7 @@
     "@types/check-types": "^7.3.7",
     "@types/path-browserify": "^1.0.2",
     "chai": "^3.3.0",
-    "mocha": "^11.1.0",
+    "mocha": "^10.2.0",
     "sinon": "^9.2.4",
     "typescript": "^5.0.4"
   },

@@ -193,13 +193,4 @@ describe('LazyStringFileData', function () {
     expect(fileData.getStringLength()).to.equal(longString.length)
     expect(fileData.getOperations()).to.have.length(1)
   })
-
-  it('truncates its operations after being stored', async function () {
-    const testHash = File.EMPTY_FILE_HASH
-    const fileData = new LazyStringFileData(testHash, undefined, 0)
-    fileData.edit(new TextOperation().insert('abc'))
-    const stored = await fileData.store(this.blobStore)
-    expect(fileData.hash).to.equal(stored.hash)
-    expect(fileData.operations).to.deep.equal([])
-  })
 })

@@ -1,3 +1,4 @@
+// @ts-check
 'use strict'

 const { expect } = require('chai')

@@ -448,44 +449,4 @@ describe('Range', function () {
       expect(() => range.insertAt(16, 3)).to.throw()
     })
   })
-
-  describe('intersect', function () {
-    it('should handle partially overlapping ranges', function () {
-      const range1 = new Range(5, 10)
-      const range2 = new Range(3, 6)
-      const intersection1 = range1.intersect(range2)
-      expect(intersection1.pos).to.equal(5)
-      expect(intersection1.length).to.equal(4)
-      const intersection2 = range2.intersect(range1)
-      expect(intersection2.pos).to.equal(5)
-      expect(intersection2.length).to.equal(4)
-    })
-
-    it('should intersect with itself', function () {
-      const range = new Range(5, 10)
-      const intersection = range.intersect(range)
-      expect(intersection.pos).to.equal(5)
-      expect(intersection.length).to.equal(10)
-    })
-
-    it('should handle nested ranges', function () {
-      const range1 = new Range(5, 10)
-      const range2 = new Range(7, 2)
-      const intersection1 = range1.intersect(range2)
-      expect(intersection1.pos).to.equal(7)
-      expect(intersection1.length).to.equal(2)
-      const intersection2 = range2.intersect(range1)
-      expect(intersection2.pos).to.equal(7)
-      expect(intersection2.length).to.equal(2)
-    })
-
-    it('should handle disconnected ranges', function () {
-      const range1 = new Range(5, 10)
-      const range2 = new Range(20, 30)
-      const intersection1 = range1.intersect(range2)
-      expect(intersection1).to.be.null
-      const intersection2 = range2.intersect(range1)
-      expect(intersection2).to.be.null
-    })
-  })
 })

@@ -5,11 +5,10 @@ const ot = require('..')
 const safePathname = ot.safePathname

 describe('safePathname', function () {
-  function expectClean(input, output, reason = '') {
+  function expectClean(input, output) {
     // check expected output and also idempotency
-    const [cleanedInput, gotReason] = safePathname.cleanDebug(input)
+    const cleanedInput = safePathname.clean(input)
     expect(cleanedInput).to.equal(output)
-    expect(gotReason).to.equal(reason)
     expect(safePathname.clean(cleanedInput)).to.equal(cleanedInput)
     expect(safePathname.isClean(cleanedInput)).to.be.true
   }

@@ -23,56 +22,44 @@ describe('safePathname', function () {
     expect(safePathname.isClean('rm -rf /')).to.be.falsy

     // replace invalid characters with underscores
-    expectClean(
-      'test-s*\u0001\u0002m\u0007st\u0008.jpg',
-      'test-s___m_st_.jpg',
-      'cleanPart'
-    )
+    expectClean('test-s*\u0001\u0002m\u0007st\u0008.jpg', 'test-s___m_st_.jpg')

     // keep slashes, normalize paths, replace ..
-    expectClean('./foo', 'foo', 'normalize')
-    expectClean('../foo', '__/foo', 'cleanPart')
-    expectClean('foo/./bar', 'foo/bar', 'normalize')
-    expectClean('foo/../bar', 'bar', 'normalize')
-    expectClean('../../tricky/foo.bar', '__/__/tricky/foo.bar', 'cleanPart')
-    expectClean(
-      'foo/../../tricky/foo.bar',
-      '__/tricky/foo.bar',
-      'normalize,cleanPart'
-    )
-    expectClean('foo/bar/../../tricky/foo.bar', 'tricky/foo.bar', 'normalize')
-    expectClean(
-      'foo/bar/baz/../../tricky/foo.bar',
-      'foo/tricky/foo.bar',
-      'normalize'
-    )
+    expectClean('./foo', 'foo')
+    expectClean('../foo', '__/foo')
+    expectClean('foo/./bar', 'foo/bar')
+    expectClean('foo/../bar', 'bar')
+    expectClean('../../tricky/foo.bar', '__/__/tricky/foo.bar')
+    expectClean('foo/../../tricky/foo.bar', '__/tricky/foo.bar')
+    expectClean('foo/bar/../../tricky/foo.bar', 'tricky/foo.bar')
+    expectClean('foo/bar/baz/../../tricky/foo.bar', 'foo/tricky/foo.bar')

     // remove illegal chars even when there is no extension
-    expectClean('**foo', '__foo', 'cleanPart')
+    expectClean('**foo', '__foo')

     // remove windows file paths
-    expectClean('c:\\temp\\foo.txt', 'c:/temp/foo.txt', 'workaround for IE')
+    expectClean('c:\\temp\\foo.txt', 'c:/temp/foo.txt')

     // do not allow a leading slash (relative paths only)
-    expectClean('/foo', '_/foo', 'no leading /')
-    expectClean('//foo', '_/foo', 'normalize,no leading /')
+    expectClean('/foo', '_/foo')
+    expectClean('//foo', '_/foo')

     // do not allow multiple leading slashes
-    expectClean('//foo', '_/foo', 'normalize,no leading /')
+    expectClean('//foo', '_/foo')

     // do not allow a trailing slash
-    expectClean('/', '_', 'no leading /,no trailing /')
-    expectClean('foo/', 'foo', 'no trailing /')
-    expectClean('foo.tex/', 'foo.tex', 'no trailing /')
+    expectClean('/', '_')
+    expectClean('foo/', 'foo')
+    expectClean('foo.tex/', 'foo.tex')

     // do not allow multiple trailing slashes
-    expectClean('//', '_', 'normalize,no leading /,no trailing /')
-    expectClean('///', '_', 'normalize,no leading /,no trailing /')
-    expectClean('foo//', 'foo', 'normalize,no trailing /')
+    expectClean('//', '_')
+    expectClean('///', '_')
+    expectClean('foo//', 'foo')

     // file and folder names that consist of . and .. are not OK
-    expectClean('.', '_', 'cleanPart')
-    expectClean('..', '__', 'cleanPart')
+    expectClean('.', '_')
+    expectClean('..', '__')
     // we will allow name with more dots e.g. ... and ....
     expectClean('...', '...')
     expectClean('....', '....')

@@ -95,10 +82,10 @@ describe('safePathname', function () {
     expectClean('a b.png', 'a b.png')

     // leading and trailing spaces are not OK
-    expectClean(' foo', 'foo', 'no leading spaces')
-    expectClean('  foo', 'foo', 'no leading spaces')
-    expectClean('foo ', 'foo', 'no trailing spaces')
-    expectClean('foo  ', 'foo', 'no trailing spaces')
+    expectClean(' foo', 'foo')
+    expectClean('  foo', 'foo')
+    expectClean('foo ', 'foo')
+    expectClean('foo  ', 'foo')

     // reserved file names on Windows should not be OK, but we already have
     // some in the old system, so have to allow them for now

@@ -113,14 +100,14 @@ describe('safePathname', function () {
     // there's no particular reason to allow multiple slashes; sometimes people
     // seem to rename files to URLs (https://domain/path) in an attempt to
     // upload a file, and this results in an empty directory name
-    expectClean('foo//bar.png', 'foo/bar.png', 'normalize')
-    expectClean('foo///bar.png', 'foo/bar.png', 'normalize')
+    expectClean('foo//bar.png', 'foo/bar.png')
+    expectClean('foo///bar.png', 'foo/bar.png')

     // Check javascript property handling
     expectClean('foo/prototype', 'foo/prototype') // OK as part of a pathname
     expectClean('prototype/test.txt', 'prototype/test.txt')
-    expectClean('prototype', '@prototype', 'BLOCKED_FILE_RX') // not OK as whole pathname
-    expectClean('hasOwnProperty', '@hasOwnProperty', 'BLOCKED_FILE_RX')
-    expectClean('**proto**', '@__proto__', 'cleanPart,BLOCKED_FILE_RX')
+    expectClean('prototype', '@prototype') // not OK as whole pathname
+    expectClean('hasOwnProperty', '@hasOwnProperty')
+    expectClean('**proto**', '@__proto__')
   })
 })

@@ -107,7 +107,7 @@ describe('RetainOp', function () {
     expect(op1.equals(new RetainOp(3))).to.be.true
   })

-  it('cannot merge with another RetainOp if the tracking user is different', function () {
+  it('cannot merge with another RetainOp if tracking info is different', function () {
     const op1 = new RetainOp(
       4,
       new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))

@@ -120,14 +120,14 @@ describe('RetainOp', function () {
     expect(() => op1.mergeWith(op2)).to.throw(Error)
   })

-  it('can merge with another RetainOp if the tracking user is the same', function () {
+  it('can merge with another RetainOp if tracking info is the same', function () {
     const op1 = new RetainOp(
       4,
       new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
     )
     const op2 = new RetainOp(
       4,
-      new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:01.000Z'))
+      new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
     )
     op1.mergeWith(op2)
     expect(

@@ -310,7 +310,7 @@ describe('InsertOp', function () {
     expect(() => op1.mergeWith(op2)).to.throw(Error)
   })

-  it('cannot merge with another InsertOp if tracking user is different', function () {
+  it('cannot merge with another InsertOp if tracking info is different', function () {
     const op1 = new InsertOp(
       'a',
       new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))

@@ -323,7 +323,7 @@ describe('InsertOp', function () {
     expect(() => op1.mergeWith(op2)).to.throw(Error)
   })

-  it('can merge with another InsertOp if tracking user and comment info is the same', function () {
+  it('can merge with another InsertOp if tracking and comment info is the same', function () {
     const op1 = new InsertOp(
       'a',
       new TrackingProps(

@@ -338,7 +338,7 @@ describe('InsertOp', function () {
       new TrackingProps(
         'insert',
         'user1',
-        new Date('2024-01-01T00:00:01.000Z')
+        new Date('2024-01-01T00:00:00.000Z')
       ),
       ['1', '2']
     )

@@ -322,47 +322,6 @@ describe('TextOperation', function () {
       new TextOperation().retain(4).remove(4).retain(3)
     )
   })
-
-  it('undoing a tracked delete restores the tracked changes', function () {
-    expectInverseToLeadToInitialState(
-      new StringFileData(
-        'the quick brown fox jumps over the lazy dog',
-        undefined,
-        [
-          {
-            range: { pos: 5, length: 5 },
-            tracking: {
-              ts: '2023-01-01T00:00:00.000Z',
-              type: 'insert',
-              userId: 'user1',
-            },
-          },
-          {
-            range: { pos: 12, length: 3 },
-            tracking: {
-              ts: '2023-01-01T00:00:00.000Z',
-              type: 'delete',
-              userId: 'user1',
-            },
-          },
-          {
-            range: { pos: 18, length: 5 },
-            tracking: {
-              ts: '2023-01-01T00:00:00.000Z',
-              type: 'insert',
-              userId: 'user1',
-            },
-          },
-        ]
-      ),
-      new TextOperation()
-        .retain(7)
-        .retain(13, {
-          tracking: new TrackingProps('delete', 'user1', new Date()),
-        })
-        .retain(23)
-    )
-  })
 })

 describe('compose', function () {

libraries/promise-utils/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/promise-utils/.gitignore (new file, vendored, 3 lines)
@@ -0,0 +1,3 @@
+
+# managed by monorepo$ bin/update_build_scripts
+.npmrc

@@ -1 +1 @@
-22.17.0
+20.18.0

@@ -1,10 +1,10 @@
 promise-utils
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0

@@ -13,7 +13,6 @@ module.exports = {
   expressify,
   expressifyErrorHandler,
   promiseMapWithLimit,
-  promiseMapSettledWithLimit,
 }

 /**

@@ -265,19 +264,3 @@ async function promiseMapWithLimit(concurrency, array, fn) {
   const limit = pLimit(concurrency)
   return await Promise.all(array.map(x => limit(() => fn(x))))
 }
-
-/**
- * Map values in `array` with the async function `fn`
- *
- * Limit the number of unresolved promises to `concurrency`.
- *
- * @template T, U
- * @param {number} concurrency
- * @param {Array<T>} array
- * @param {(T) => Promise<U>} fn
- * @return {Promise<Array<PromiseSettledResult<U>>>}
- */
-function promiseMapSettledWithLimit(concurrency, array, fn) {
-  const limit = pLimit(concurrency)
-  return Promise.allSettled(array.map(x => limit(() => fn(x))))
-}

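A usage sketch for the removed helper; `urls`, `fetchUrl`, and the logging are hypothetical, and the require path assumes the monorepo's published package name:

```js
const { promiseMapSettledWithLimit } = require('@overleaf/promise-utils')

const results = await promiseMapSettledWithLimit(2, urls, fetchUrl)
for (const result of results) {
  if (result.status === 'fulfilled') {
    console.log('ok', result.value)
  } else {
    // Unlike promiseMapWithLimit, one rejection does not abort the whole
    // batch; each input gets its own PromiseSettledResult.
    console.warn('failed', result.reason)
  }
}
```
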
@@ -18,7 +18,7 @@
   "devDependencies": {
     "chai": "^4.3.10",
     "chai-as-promised": "^7.1.1",
-    "mocha": "^11.1.0",
+    "mocha": "^10.2.0",
     "typescript": "^5.0.4"
   },
   "dependencies": {

libraries/ranges-tracker/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/ranges-tracker/.gitignore (new file, vendored, 13 lines)
@@ -0,0 +1,13 @@
+**.swp
+
+app.js
+app/js/
+test/unit/js/
+public/build/
+
+node_modules/
+
+/public/js/chat.js
+plato/
+
+.npmrc

@@ -1 +1 @@
-22.17.0
+20.18.0

@@ -1,10 +1,10 @@
 ranges-tracker
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0

@@ -145,7 +145,11 @@ class RangesTracker {
   }

   removeChangeId(changeId) {
-    this.removeChangeIds([changeId])
+    const change = this.getChange(changeId)
+    if (change == null) {
+      return
+    }
+    this._removeChange(change)
   }

   removeChangeIds(ids) {

@@ -312,7 +316,7 @@ class RangesTracker {
     const movedChanges = []
     const removeChanges = []
     const newChanges = []
-    const trackedDeletesAtOpPosition = []

     for (let i = 0; i < this.changes.length; i++) {
       change = this.changes[i]
       const changeStart = change.op.p

@@ -323,15 +327,13 @@ class RangesTracker {
           change.op.p += opLength
           movedChanges.push(change)
         } else if (opStart === changeStart) {
-          // If we are undoing, then we want to cancel any existing delete ranges if we can.
-          // Check if the insert matches the start of the delete, and just remove it from the delete instead if so.
           if (
-            !alreadyMerged &&
             undoing &&
             change.op.d.length >= op.i.length &&
             change.op.d.slice(0, op.i.length) === op.i
           ) {
+            // If we are undoing, then we want to reject any existing tracked delete if we can.
+            // Check if the insert matches the start of the delete, and just
+            // remove it from the delete instead if so.
             change.op.d = change.op.d.slice(op.i.length)
             change.op.p += op.i.length
             if (change.op.d === '') {

@@ -340,25 +342,9 @@ class RangesTracker {
              movedChanges.push(change)
            }
            alreadyMerged = true
-
-           // Any tracked delete that came before this tracked delete
-           // rejection was moved after the incoming insert. Move them back
-           // so that they appear before the tracked delete rejection.
-           for (const trackedDelete of trackedDeletesAtOpPosition) {
-             trackedDelete.op.p -= opLength
-           }
          } else {
-           // We're not rejecting that tracked delete. Move it after the
-           // insert.
            change.op.p += opLength
            movedChanges.push(change)
-
-           // Keep track of tracked deletes that are at the same position as the
-           // insert. If we find a tracked delete to reject, we'll want to
-           // reposition them.
-           if (!alreadyMerged) {
-             trackedDeletesAtOpPosition.push(change)
-           }
          }
        } else if (change.op.i != null) {

@@ -638,11 +624,9 @@ class RangesTracker {
   }

   _addOp(op, metadata) {
-    // Don't take a reference to the existing op since we'll modify this in place with future changes
-    op = this._clone(op)
     const change = {
       id: this.newId(),
-      op,
+      op: this._clone(op), // Don't take a reference to the existing op since we'll modify this in place with future changes
       metadata: this._clone(metadata),
     }
     this.changes.push(change)

@@ -665,7 +649,7 @@ class RangesTracker {
   }

   _removeChange(change) {
-    this.changes = this.changes.filter(c => c !== change)
+    this.changes = this.changes.filter(c => c.id !== change.id)
     this._markAsDirty(change, 'change', 'removed')
   }

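Why the ext-ce side filters by object identity here: with duplicate ids, which occur in legacy data and which the tests below construct on purpose, filtering by `id` removes every change sharing that id. A hypothetical illustration:

```js
const changes = [
  { id: 'id1', op: { p: 10, i: 'one' } },
  { id: 'id1', op: { p: 20, i: 'two' } },
]
const target = changes[1]
changes.filter(c => c !== target)       // keeps the unrelated 'id1' change
changes.filter(c => c.id !== target.id) // drops both entries
```
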
@@ -20,7 +20,7 @@
   },
   "devDependencies": {
     "chai": "^4.3.6",
-    "mocha": "^11.1.0",
+    "mocha": "^10.2.0",
     "typescript": "^5.0.4"
   }
 }

@@ -4,7 +4,6 @@ const RangesTracker = require('../..')

 describe('RangesTracker', function () {
   describe('with duplicate change ids', function () {
     beforeEach(function () {
       this.comments = []
       this.changes = [
         { id: 'id1', op: { p: 1, i: 'hello' } },
         { id: 'id2', op: { p: 10, i: 'world' } },

@@ -27,199 +26,4 @@ describe('RangesTracker', function () {
       expect(this.rangesTracker.changes).to.deep.equal([this.changes[2]])
     })
   })
-
-  describe('with duplicate tracked insert ids', function () {
-    beforeEach(function () {
-      this.comments = []
-      this.changes = [
-        { id: 'id1', op: { p: 10, i: 'one' } },
-        { id: 'id1', op: { p: 20, i: 'two' } },
-        { id: 'id1', op: { p: 30, d: 'three' } },
-      ]
-      this.rangesTracker = new RangesTracker(this.changes, this.comments)
-    })
-
-    it("deleting one tracked insert doesn't delete the others", function () {
-      this.rangesTracker.applyOp({ p: 20, d: 'two' })
-      expect(this.rangesTracker.changes).to.deep.equal([
-        this.changes[0],
-        this.changes[2],
-      ])
-    })
-  })
-
-  describe('with duplicate tracked delete ids', function () {
-    beforeEach(function () {
-      this.comments = []
-      this.changes = [
-        { id: 'id1', op: { p: 10, d: 'one' } },
-        { id: 'id1', op: { p: 20, d: 'two' } },
-        { id: 'id1', op: { p: 30, d: 'three' } },
-      ]
-      this.rangesTracker = new RangesTracker(this.changes, this.comments)
-    })
-
-    it('deleting over tracked deletes in tracked changes mode removes the tracked deletes covered', function () {
-      this.rangesTracker.track_changes = true
-      this.rangesTracker.applyOp({
-        p: 15,
-        d: '567890123456789012345',
-      })
-      expect(this.rangesTracker.changes.map(c => c.op)).to.deep.equal([
-        { p: 10, d: 'one' },
-        { p: 15, d: '56789two0123456789three012345' },
-      ])
-    })
-
-    it('a tracked delete between two tracked deletes joins them into a single tracked delete', function () {
-      this.rangesTracker.track_changes = true
-      this.rangesTracker.applyOp({
-        p: 20,
-        d: '0123456789',
-      })
-      expect(this.rangesTracker.changes.map(c => c.op)).to.deep.equal([
-        { p: 10, d: 'one' },
-        { p: 20, d: 'two0123456789three' },
-      ])
-    })
-
-    it("rejecting one tracked delete doesn't reject the others", function () {
-      this.rangesTracker.track_changes = true
-      this.rangesTracker.applyOp({
-        p: 20,
-        i: 'two',
-        u: true,
-      })
-      expect(this.rangesTracker.changes.map(c => c.op)).to.deep.equal([
-        { p: 10, d: 'one' },
-        { p: 33, d: 'three' },
-      ])
-    })
-
-    it("rejecting all tracked deletes doesn't introduce tracked inserts", function () {
-      this.rangesTracker.track_changes = true
-      this.rangesTracker.applyOp({
-        p: 10,
-        i: 'one',
-        u: true,
-      })
-      this.rangesTracker.applyOp({
-        p: 23,
-        i: 'two',
-        u: true,
-      })
-      this.rangesTracker.applyOp({
-        p: 36,
-        i: 'three',
-        u: true,
-      })
-      expect(this.rangesTracker.changes.map(c => c.op)).to.deep.equal([])
-    })
-  })
-
-  describe('with multiple tracked deletes at the same position', function () {
-    beforeEach(function () {
-      this.comments = []
-      this.changes = [
-        { id: 'id1', op: { p: 33, d: 'before' } },
-        { id: 'id2', op: { p: 50, d: 'right before' } },
-        { id: 'id3', op: { p: 50, d: 'this one' } },
-        { id: 'id4', op: { p: 50, d: 'right after' } },
-        { id: 'id5', op: { p: 75, d: 'long after' } },
-      ]
-      this.rangesTracker = new RangesTracker(this.changes, this.comments)
-    })
-
-    it('preserves the text order when rejecting changes', function () {
-      this.rangesTracker.applyOp(
-        { p: 50, i: 'this one', u: true },
-        { user_id: 'user-id' }
-      )
-      expect(this.rangesTracker.changes).to.deep.equal([
-        { id: 'id1', op: { p: 33, d: 'before' } },
-        { id: 'id2', op: { p: 50, d: 'right before' } },
-        { id: 'id4', op: { p: 58, d: 'right after' } },
-        { id: 'id5', op: { p: 83, d: 'long after' } },
-      ])
-    })
-
-    it('moves all tracked deletes after the insert if not rejecting changes', function () {
-      this.rangesTracker.applyOp(
-        { p: 50, i: 'some other text', u: true, orderedRejections: true },
-        { user_id: 'user-id' }
-      )
-      expect(this.rangesTracker.changes).to.deep.equal([
-        { id: 'id1', op: { p: 33, d: 'before' } },
-        { id: 'id2', op: { p: 65, d: 'right before' } },
-        { id: 'id3', op: { p: 65, d: 'this one' } },
-        { id: 'id4', op: { p: 65, d: 'right after' } },
-        { id: 'id5', op: { p: 90, d: 'long after' } },
-      ])
-    })
-  })
-
-  describe('with multiple tracked deletes at the same position with the same content', function () {
-    beforeEach(function () {
-      this.comments = []
-      this.changes = [
-        { id: 'id1', op: { p: 10, d: 'cat' } },
-        { id: 'id2', op: { p: 10, d: 'giraffe' } },
-        { id: 'id3', op: { p: 10, d: 'cat' } },
-        { id: 'id4', op: { p: 10, d: 'giraffe' } },
-      ]
-      this.rangesTracker = new RangesTracker(this.changes, this.comments)
-    })
-
-    it('removes only the first matching tracked delete', function () {
-      this.rangesTracker.applyOp(
-        { p: 10, i: 'giraffe', u: true },
-        { user_id: 'user-id' }
-      )
-      expect(this.rangesTracker.changes).to.deep.equal([
-        { id: 'id1', op: { p: 10, d: 'cat' } },
-        { id: 'id3', op: { p: 17, d: 'cat' } },
-        { id: 'id4', op: { p: 17, d: 'giraffe' } },
-      ])
-    })
-  })
-
-  describe('with a tracked insert at the same position as a tracked delete', function () {
-    beforeEach(function () {
-      this.comments = []
-      this.changes = [
-        {
-          id: 'id1',
-          op: { p: 5, d: 'before' },
-          metadata: { user_id: 'user-id' },
-        },
-        {
-          id: 'id2',
-          op: { p: 10, d: 'delete' },
-          metadata: { user_id: 'user-id' },
-        },
-        {
-          id: 'id3',
-          op: { p: 10, i: 'insert' },
-          metadata: { user_id: 'user-id' },
-        },
-      ]
-      this.rangesTracker = new RangesTracker(this.changes, this.comments)
-    })
-
-    it('places a tracked insert at the same position before both the delete and the insert', function () {
-      this.rangesTracker.track_changes = true
-      this.rangesTracker.applyOp(
-        { p: 10, i: 'incoming' },
-        { user_id: 'user-id' }
-      )
-      expect(this.rangesTracker.changes.map(change => change.op)).to.deep.equal(
-        [
-          { p: 5, d: 'before' },
-          { p: 10, i: 'incoming' },
-          { p: 18, d: 'delete' },
-          { p: 18, i: 'insert' },
-        ]
-      )
-    })
-  })
 })

libraries/redis-wrapper/.dockerignore (new file, 1 line)
@@ -0,0 +1 @@
+node_modules/

libraries/redis-wrapper/.gitignore (new file, vendored, 13 lines)
@@ -0,0 +1,13 @@
+**.swp
+
+app.js
+app/js/
+test/unit/js/
+public/build/
+
+node_modules/
+
+/public/js/chat.js
+plato/
+
+.npmrc

Some files were not shown because too many files have changed in this diff.