mirror of https://github.com/yu-i-i/overleaf-cep.git
synced 2025-07-26 05:00:06 +02:00

Compare commits: 11 commits (ext-ce ... v5.3.1-ext)

Commits: 9f39fd05e1, 2d1c61650a, 91bebe3cbf, 1e407e34e5, 5d2febba7d, 83719f84c2, 1fbbfe176f, e992cfca64, b748a664a8, a7ddd99339, 24fb73810d
3081 changed files with 131147 additions and 151766 deletions
@ -1,19 +1,10 @@
---
name: Bug report
about: Report a bug
title: ''
labels: type:bug
assignees: ''

---

<!--
Note: If you are using www.overleaf.com and have a problem,
or if you would like to request a new feature please contact
the support team at support@overleaf.com

This form should only be used to report bugs in the
Community Edition release of Overleaf.
-->
862	README.md
@ -27,7 +27,6 @@

The present "extended" version of Overleaf CE includes:

- Template Gallery
- Sandboxed Compiles with TeX Live image selection
- LDAP authentication
- SAML authentication
@ -35,13 +34,6 @@ The present "extended" version of Overleaf CE includes:

- Real-time track changes and comments
- Autocomplete of reference keys
- Symbol Palette
- "From External URL" feature

> [!CAUTION]
> Overleaf Community Edition is intended for use in environments where **all** users are trusted. Community Edition is **not** appropriate for scenarios that require isolation of users, since without Sandboxed Compiles users have full read and write access to the `sharelatex` container resources (filesystem, network, environment variables) when running LaTeX compiles.
> Therefore, in any environment where not all users can be fully trusted, it is strongly recommended to enable the Sandboxed Compiles feature available in the Extended Community Edition.

For more information on Sandboxed Compiles, check out the Overleaf [documentation](https://docs.overleaf.com/on-premises/configuration/overleaf-toolkit/server-pro-only-configuration/sandboxed-compiles).

## Enterprise
@ -50,16 +42,847 @@ If you want help installing and maintaining Overleaf in your lab or workplace, O
## Installation

Detailed installation instructions can be found in the [Overleaf Toolkit](https://github.com/overleaf/toolkit/).
Configuration details and release history for the Extended Community Edition can be found on the [Extended CE Wiki Page](https://github.com/yu-i-i/overleaf-cep/wiki).

To run a custom image, add a file named `docker-compose.override.yml` with the following or similar content into the `overleaf-toolkit/config` directory:

```yml
---
version: '2.2'
services:
    sharelatex:
        image: sharelatex/sharelatex:ext-ce
        volumes:
            - ../config/certs:/overleaf/certs
```

Here, the attached volume gives the container convenient access to the certificates needed for SAML or LDAP authentication.

If you want to build a Docker image of the extended CE based on the upstream v5.3.1 codebase, you can check out the corresponding tag by running:

```
git checkout v5.3.1-ext-v1
```

After building the image, switch to the latest state of the repository and check the `server-ce/hotfix` directory. If a subdirectory matching your version (e.g., `5.3.1`) exists, build a patched image.

Alternatively, you can download a prebuilt image from Docker Hub:

```
docker pull overleafcep/sharelatex:5.3.1-ext-v1
```

Make sure to update the image name in `overleaf-toolkit/config/docker-compose.override.yml` accordingly.
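For example, when using the prebuilt image pulled above, the override file from the beginning of this section would point at the Docker Hub repository instead. A minimal sketch (adjust the tag to the release you actually pulled):

```yml
---
version: '2.2'
services:
    sharelatex:
        image: overleafcep/sharelatex:5.3.1-ext-v1
        volumes:
            - ../config/certs:/overleaf/certs
```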
## Sandboxed Compiles

To enable sandboxed compiles (also known as "Sibling containers"), set the following configuration options in `overleaf-toolkit/config/overleaf.rc`:

```
SERVER_PRO=true
SIBLING_CONTAINERS_ENABLED=true
```

The following environment variables are used to specify which TeX Live images to use for sandboxed compiles:

- `ALL_TEX_LIVE_DOCKER_IMAGES` **(required)**
  * A comma-separated list of TeX Live images to use. These images will be downloaded or updated. To skip downloading the images, set `SIBLING_CONTAINERS_PULL=false` in `config/overleaf.rc`.
- `ALL_TEX_LIVE_DOCKER_IMAGE_NAMES`
  * A comma-separated list of friendly names for the images. If omitted, the version name will be used (e.g., `latest-full`).
- `TEX_LIVE_DOCKER_IMAGE` **(required)**
  * The default TeX Live image used for compiling new projects. `ALL_TEX_LIVE_DOCKER_IMAGES` must include this image.

Users can select the image for their project in the project menu.

Here is an example where the default TeX Live image is `latest-full` from Docker Hub, but the `TL2023-historic` image can be used for older projects:

```
ALL_TEX_LIVE_DOCKER_IMAGES=texlive/texlive:latest-full, texlive/texlive:TL2023-historic
ALL_TEX_LIVE_DOCKER_IMAGE_NAMES=TeXLive 2024, TeXLive 2023
TEX_LIVE_DOCKER_IMAGE=texlive/texlive:latest-full
```

For additional details refer to
[Server Pro: Sandboxed Compiles](https://github.com/overleaf/overleaf/wiki/Server-Pro:-Sandboxed-Compiles) and
[Toolkit: Sandboxed Compiles](https://github.com/overleaf/toolkit/blob/master/doc/sandboxed-compiles.md).

When the compilation takes place in a dedicated container, it is relatively safe to permit running external commands from inside the TeX file during compilation. This is required for packages like `minted`. For this purpose, the following environment variable can be used:

- `TEX_COMPILER_EXTRA_FLAGS`
  * A list of extra flags for the TeX compiler. Example: `-shell-escape -file-line-error`
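Putting the `overleaf.rc` options together, a sketch of a configuration that enables sandboxed compiles but skips pulling the TeX Live images (all three options are described above):

```
SERVER_PRO=true
SIBLING_CONTAINERS_ENABLED=true
SIBLING_CONTAINERS_PULL=false
```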
<details>
<summary><h4>Sample variables.env file</h4></summary>

```
OVERLEAF_APP_NAME="Our Overleaf Instance"

ENABLED_LINKED_FILE_TYPES=project_file,project_output_file

# Enables Thumbnail generation using ImageMagick
ENABLE_CONVERSIONS=true

# Disables email confirmation requirement
EMAIL_CONFIRMATION_DISABLED=true

## Nginx
# NGINX_WORKER_PROCESSES=4
# NGINX_WORKER_CONNECTIONS=768

## Set for TLS via nginx-proxy
# OVERLEAF_BEHIND_PROXY=true
# OVERLEAF_SECURE_COOKIE=true

OVERLEAF_SITE_URL=http://my-overleaf-instance.com
OVERLEAF_NAV_TITLE=Our Overleaf Instance
# OVERLEAF_HEADER_IMAGE_URL=http://somewhere.com/mylogo.png
OVERLEAF_ADMIN_EMAIL=support@example.com

OVERLEAF_LEFT_FOOTER=[{"text": "Contact your support team", "url": "mailto:support@example.com"}]
OVERLEAF_RIGHT_FOOTER=[{"text":"Hello, I am on the Right", "url":"https://github.com/yu-i-i/overleaf-cep"}]

OVERLEAF_EMAIL_FROM_ADDRESS=team@example.com
OVERLEAF_EMAIL_SMTP_HOST=smtp.example.com
OVERLEAF_EMAIL_SMTP_PORT=587
OVERLEAF_EMAIL_SMTP_SECURE=false
# OVERLEAF_EMAIL_SMTP_USER=
# OVERLEAF_EMAIL_SMTP_PASS=
# OVERLEAF_EMAIL_SMTP_NAME=
OVERLEAF_EMAIL_SMTP_LOGGER=false
OVERLEAF_EMAIL_SMTP_TLS_REJECT_UNAUTH=true
OVERLEAF_EMAIL_SMTP_IGNORE_TLS=false
OVERLEAF_CUSTOM_EMAIL_FOOTER=This system is run by department x

OVERLEAF_PROXY_LEARN=true
NAV_HIDE_POWERED_BY=true

########################
## Sandboxed Compiles ##
########################

ALL_TEX_LIVE_DOCKER_IMAGES=texlive/texlive:latest-full, texlive/texlive:TL2023-historic
ALL_TEX_LIVE_DOCKER_IMAGE_NAMES=TeXLive 2024, TeXLive 2023
TEX_LIVE_DOCKER_IMAGE=texlive/texlive:latest-full
TEX_COMPILER_EXTRA_FLAGS=-shell-escape
```
</details>
## Authentication Methods

The following authentication methods are supported: local authentication, LDAP authentication, SAML authentication, and OpenID Connect (OIDC) authentication. Local authentication is always active. The environment variable `EXTERNAL_AUTH` specifies which external authentication methods are activated; its value is a list. If the list includes `ldap`, `saml`, or `oidc`, then LDAP, SAML, or OIDC authentication is activated, respectively.

For example: `EXTERNAL_AUTH=ldap saml oidc`

This configuration activates all available authentication methods, although this is rarely necessary.
<details>
<summary><h3>Local Authentication</h3></summary>

Passwords of local users are stored in the MongoDB database. An admin user can create new local users. For details, visit the [wiki of the Overleaf project](https://github.com/overleaf/overleaf/wiki/Creating-and-managing-users).

It is possible to enforce password restrictions on local users; an example snippet follows the list:

* `OVERLEAF_PASSWORD_VALIDATION_MIN_LENGTH`: The minimum length required
* `OVERLEAF_PASSWORD_VALIDATION_MAX_LENGTH`: The maximum length allowed
* `OVERLEAF_PASSWORD_VALIDATION_PATTERN`: Used to validate password strength
  - `abc123` – the password must contain 3 letters and 3 digits and be at least 6 characters long
  - `aA` – the password must contain lower- and uppercase letters and be at least 2 characters long
  - `ab$3` – the password must contain letters, digits and symbols and be at least 4 characters long
  - There are 4 groups of characters: letters, UPPERcase letters, digits, symbols. Anything that is neither a letter nor a digit is considered a symbol.
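For instance, a policy requiring mixed-case passwords of 10 to 72 characters could be expressed with the variables above; the exact bounds here are only illustrative:

```
OVERLEAF_PASSWORD_VALIDATION_MIN_LENGTH=10
OVERLEAF_PASSWORD_VALIDATION_MAX_LENGTH=72
OVERLEAF_PASSWORD_VALIDATION_PATTERN=aA
```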

</details>

<details>
<summary><h3>LDAP Authentication</h3></summary>

Internally, Overleaf LDAP uses the [passport-ldapauth](https://github.com/vesse/passport-ldapauth) library. Most of these configuration options are passed through to the `server` config object, which is used to configure `passport-ldapauth`. If you are having issues configuring LDAP, it is worth reading the README for `passport-ldapauth` to understand the configuration it expects.

When using the Local and LDAP authentication methods, a user enters a `username` and `password` in the login form. If LDAP authentication is enabled, it is attempted first:

1. An LDAP user is searched for in the LDAP directory using the filter defined by `OVERLEAF_LDAP_SEARCH_FILTER` and authenticated.
2. If authentication is successful, the Overleaf users database is checked for a user whose primary email address matches the email address of the authenticated LDAP user:
   - If a matching user is found, the `hashedPassword` field for this user is deleted (if it exists). This ensures that the user can only log in via LDAP authentication in the future.
   - If no matching user is found, a new Overleaf user is created using the email, first name, and last name retrieved from the LDAP server.
3. If LDAP authentication fails, local authentication is attempted.

#### Environment Variables
- `OVERLEAF_LDAP_URL` **(required)**
  * URL of the LDAP server.
    - Example: `ldaps://ldap.example.com:636` (LDAP over SSL)
    - Example: `ldap://ldap.example.com:389` (unencrypted or STARTTLS, if configured)

- `OVERLEAF_LDAP_EMAIL_ATT`
  * The email attribute returned by the LDAP server, defaults to `mail`. Each LDAP user must have at least one email address. If multiple addresses are provided, only the first one will be used.

- `OVERLEAF_LDAP_FIRST_NAME_ATT`
  * The property holding the first name of the user, usually `givenName`.

- `OVERLEAF_LDAP_LAST_NAME_ATT`
  * The property holding the family name of the user, usually `sn`.

- `OVERLEAF_LDAP_NAME_ATT`
  * The property holding the full name of the user, usually `cn`. If either of the two previous variables is not defined, the first and/or last name of the user is extracted from this variable. Otherwise, it is not used.

- `OVERLEAF_LDAP_PLACEHOLDER`
  * The placeholder for the login form, defaults to `Username`.

- `OVERLEAF_LDAP_UPDATE_USER_DETAILS_ON_LOGIN`
  * If set to `true`, updates the LDAP user's `first_name` and `last_name` fields on login and disables the user details form on the `/user/settings` page for LDAP users. Otherwise, details are fetched only on first login.

- `OVERLEAF_LDAP_BIND_DN`
  * The distinguished name of the LDAP user used for the LDAP connection (this user should be able to search/list accounts on the LDAP server), e.g., `cn=ldap_reader,dc=example,dc=com`. If not defined, anonymous binding is used.

- `OVERLEAF_LDAP_BIND_CREDENTIALS`
  * Password for `OVERLEAF_LDAP_BIND_DN`.

- `OVERLEAF_LDAP_BIND_PROPERTY`
  * Property of the user to bind against the client, defaults to `dn`.

- `OVERLEAF_LDAP_SEARCH_BASE` **(required)**
  * The base DN from which to search for users, e.g., `ou=people,dc=example,dc=com`.

- `OVERLEAF_LDAP_SEARCH_FILTER`
  * LDAP search filter with which to find a user. Use the literal `{{username}}` to have the given username interpolated into the LDAP search.
    - Example: `(|(uid={{username}})(mail={{username}}))` (users can log in with email or with login name)
    - Example: `(sAMAccountName={{username}})` (Active Directory)

- `OVERLEAF_LDAP_SEARCH_SCOPE`
  * The scope of the search: `base`, `one`, or `sub` (default).

- `OVERLEAF_LDAP_SEARCH_ATTRIBUTES`
  * JSON array of attributes to fetch from the LDAP server, e.g., `["uid", "mail", "givenName", "sn"]`. By default, all attributes are fetched.

- `OVERLEAF_LDAP_STARTTLS`
  * If `true`, LDAP over TLS is used.

- `OVERLEAF_LDAP_TLS_OPTS_CA_PATH`
  * Path to the file containing the CA certificate used to verify the LDAP server's SSL/TLS certificate. If there are multiple certificates, it can be a JSON array of paths. The files must be accessible to the Docker container.
    - Example (one certificate): `/overleaf/certs/ldap_ca_cert.pem`
    - Example (multiple certificates): `["/overleaf/certs/ldap_ca_cert1.pem", "/overleaf/certs/ldap_ca_cert2.pem"]`

- `OVERLEAF_LDAP_TLS_OPTS_REJECT_UNAUTH`
  * If `true`, the server certificate is verified against the list of supplied CAs.

- `OVERLEAF_LDAP_CACHE`
  * If `true`, up to 100 credentials at a time will be cached for 5 minutes.

- `OVERLEAF_LDAP_TIMEOUT`
  * How long the client should let operations live for before timing out, in ms (default: Infinity).

- `OVERLEAF_LDAP_CONNECT_TIMEOUT`
  * How long the client should wait before timing out on TCP connections, in ms (default: OS default).

- `OVERLEAF_LDAP_IS_ADMIN_ATT` and `OVERLEAF_LDAP_IS_ADMIN_ATT_VALUE`
  * When both environment variables are set, the login process sets `user.isAdmin = true` if the LDAP profile contains the attribute specified by `OVERLEAF_LDAP_IS_ADMIN_ATT` and its value either matches `OVERLEAF_LDAP_IS_ADMIN_ATT_VALUE` or is an array containing `OVERLEAF_LDAP_IS_ADMIN_ATT_VALUE`; otherwise `user.isAdmin` is set to `false`. If either of these variables is not set, admin status is only set to `true` during admin user creation in Launchpad.
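Before wiring these variables into Overleaf, it can help to verify the bind DN, search base, and filter with a plain `ldapsearch` query. A sketch using the example values above, with `jdoe` as a hypothetical username:

```
ldapsearch -H ldap://ldap.example.com:389 \
  -D "cn=ldap_reader,dc=example,dc=com" -W \
  -b "ou=people,dc=example,dc=com" \
  "(|(uid=jdoe)(mail=jdoe))" mail givenName sn
```

If this returns the expected entry with its `mail`, `givenName`, and `sn` attributes, the same values should work in the corresponding `OVERLEAF_LDAP_*` variables.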
The following five variables configure how user contacts are retrieved from the LDAP server.

- `OVERLEAF_LDAP_CONTACTS_FILTER`
  * The filter used to search the LDAP server for users to load into contacts. The placeholder `{{userProperty}}` within the filter is replaced with the value of the property specified by `OVERLEAF_LDAP_CONTACTS_PROPERTY` from the LDAP user initiating the search. If not defined, no users are loaded from the LDAP server into contacts.

- `OVERLEAF_LDAP_CONTACTS_SEARCH_BASE`
  * The base DN from which to start searching for contacts. Defaults to `OVERLEAF_LDAP_SEARCH_BASE`.

- `OVERLEAF_LDAP_CONTACTS_SEARCH_SCOPE`
  * The scope of the search: `base`, `one`, or `sub` (default).

- `OVERLEAF_LDAP_CONTACTS_PROPERTY`
  * The property of the user object whose value replaces the `{{userProperty}}` placeholder in `OVERLEAF_LDAP_CONTACTS_FILTER`.

- `OVERLEAF_LDAP_CONTACTS_NON_LDAP_VALUE`
  * The value of `OVERLEAF_LDAP_CONTACTS_PROPERTY` used when the search is initiated by a non-LDAP user. If this variable is not defined, the resulting filter will match nothing. The value `*` can be used as a wildcard.
<details>
<summary><h5>Example</h5></summary>

    OVERLEAF_LDAP_CONTACTS_FILTER=(gidNumber={{userProperty}})
    OVERLEAF_LDAP_CONTACTS_PROPERTY=gidNumber
    OVERLEAF_LDAP_CONTACTS_NON_LDAP_VALUE=1000

The above example loads into the contacts of the current LDAP user all LDAP users who have the same UNIX `gid`. Non-LDAP users will have all LDAP users with UNIX `gid=1000` in their contacts.

</details>
<details>
<summary><h4>Sample variables.env file</h4></summary>

```
OVERLEAF_APP_NAME="Our Overleaf Instance"

ENABLED_LINKED_FILE_TYPES=project_file,project_output_file

# Enables Thumbnail generation using ImageMagick
ENABLE_CONVERSIONS=true

# Disables email confirmation requirement
EMAIL_CONFIRMATION_DISABLED=true

## Nginx
# NGINX_WORKER_PROCESSES=4
# NGINX_WORKER_CONNECTIONS=768

## Set for TLS via nginx-proxy
# OVERLEAF_BEHIND_PROXY=true
# OVERLEAF_SECURE_COOKIE=true

OVERLEAF_SITE_URL=http://my-overleaf-instance.com
OVERLEAF_NAV_TITLE=Our Overleaf Instance
# OVERLEAF_HEADER_IMAGE_URL=http://somewhere.com/mylogo.png
OVERLEAF_ADMIN_EMAIL=support@example.com

OVERLEAF_LEFT_FOOTER=[{"text": "Contact your support team", "url": "mailto:support@example.com"}]
OVERLEAF_RIGHT_FOOTER=[{"text":"Hello, I am on the Right", "url":"https://github.com/yu-i-i/overleaf-cep"}]

OVERLEAF_EMAIL_FROM_ADDRESS=team@example.com
OVERLEAF_EMAIL_SMTP_HOST=smtp.example.com
OVERLEAF_EMAIL_SMTP_PORT=587
OVERLEAF_EMAIL_SMTP_SECURE=false
# OVERLEAF_EMAIL_SMTP_USER=
# OVERLEAF_EMAIL_SMTP_PASS=
# OVERLEAF_EMAIL_SMTP_NAME=
OVERLEAF_EMAIL_SMTP_LOGGER=false
OVERLEAF_EMAIL_SMTP_TLS_REJECT_UNAUTH=true
OVERLEAF_EMAIL_SMTP_IGNORE_TLS=false
OVERLEAF_CUSTOM_EMAIL_FOOTER=This system is run by department x

OVERLEAF_PROXY_LEARN=true
NAV_HIDE_POWERED_BY=true

#################
## LDAP for CE ##
#################

EXTERNAL_AUTH=ldap
OVERLEAF_LDAP_URL=ldap://ldap.example.com:389
OVERLEAF_LDAP_STARTTLS=true
OVERLEAF_LDAP_TLS_OPTS_CA_PATH=/overleaf/certs/ldap_ca_cert.pem
OVERLEAF_LDAP_SEARCH_BASE=ou=people,dc=example,dc=com
OVERLEAF_LDAP_SEARCH_FILTER=(|(uid={{username}})(mail={{username}}))
OVERLEAF_LDAP_BIND_DN=cn=ldap_reader,dc=example,dc=com
OVERLEAF_LDAP_BIND_CREDENTIALS=GoodNewsEveryone
OVERLEAF_LDAP_EMAIL_ATT=mail
OVERLEAF_LDAP_FIRST_NAME_ATT=givenName
OVERLEAF_LDAP_LAST_NAME_ATT=sn
# OVERLEAF_LDAP_NAME_ATT=cn
OVERLEAF_LDAP_SEARCH_ATTRIBUTES=["uid", "sn", "givenName", "mail"]

OVERLEAF_LDAP_UPDATE_USER_DETAILS_ON_LOGIN=true

OVERLEAF_LDAP_PLACEHOLDER='Username or email address'

OVERLEAF_LDAP_IS_ADMIN_ATT=mail
OVERLEAF_LDAP_IS_ADMIN_ATT_VALUE=admin@example.com

OVERLEAF_LDAP_CONTACTS_FILTER=(gidNumber={{userProperty}})
OVERLEAF_LDAP_CONTACTS_PROPERTY=gidNumber
OVERLEAF_LDAP_CONTACTS_NON_LDAP_VALUE='*'
```
</details>
<details>
<summary><i>Deprecated variables</i></summary>

**These variables will be removed soon**; use `OVERLEAF_LDAP_IS_ADMIN_ATT` and `OVERLEAF_LDAP_IS_ADMIN_ATT_VALUE` instead.

The following variables are used to determine whether the user has admin rights.
Please note: the user gains admin status if the search result is non-empty, not only when the user is explicitly included in the search results.

- `OVERLEAF_LDAP_ADMIN_SEARCH_BASE`
  * The base DN from which to start searching for the admin group. If this variable is defined, `OVERLEAF_LDAP_ADMIN_SEARCH_FILTER` must also be defined for the search to function properly.

- `OVERLEAF_LDAP_ADMIN_SEARCH_FILTER`
  * The LDAP search filter used to identify the admin group. The placeholder `{{dn}}` within the filter is replaced with the value of the property specified by `OVERLEAF_LDAP_ADMIN_DN_PROPERTY`. The placeholder `{{username}}` is also supported.

- `OVERLEAF_LDAP_ADMIN_DN_PROPERTY`
  * The property of the user object whose value replaces the `{{dn}}` placeholder in `OVERLEAF_LDAP_ADMIN_SEARCH_FILTER`, defaults to `dn`.

- `OVERLEAF_LDAP_ADMIN_SEARCH_SCOPE`
  * The scope of the LDAP search: `base`, `one`, or `sub` (default).
<details>
<summary><h5>Example</h5></summary>

In the following example, admins are members of a group `admins`; the objectClass of the entry `admins` is `groupOfNames`:

    OVERLEAF_LDAP_ADMIN_SEARCH_BASE='cn=admins,ou=group,dc=example,dc=com'
    OVERLEAF_LDAP_ADMIN_SEARCH_FILTER='(member={{dn}})'

In the following example, admins are members of a group `admins`; the objectClass of the entry `admins` is `posixGroup`:

    OVERLEAF_LDAP_ADMIN_SEARCH_BASE='cn=admins,ou=group,dc=example,dc=com'
    OVERLEAF_LDAP_ADMIN_SEARCH_FILTER='(memberUid={{username}})'

In the following example, admins are users with UNIX `gid=1234`:

    OVERLEAF_LDAP_ADMIN_SEARCH_BASE='ou=people,dc=example,dc=com'
    OVERLEAF_LDAP_ADMIN_SEARCH_FILTER='(&(gidNumber=1234)(uid={{username}}))'

In the following example, the admin is the user with `uid=someuser`:

    OVERLEAF_LDAP_ADMIN_SEARCH_BASE='ou=people,dc=example,dc=com'
    OVERLEAF_LDAP_ADMIN_SEARCH_FILTER='(&(uid=someuser)(uid={{username}}))'

The filter

    OVERLEAF_LDAP_ADMIN_SEARCH_FILTER='(uid=someuser)'

where `someuser` is the uid of an existing user, will always produce a non-empty search result.
As a result, **every user will be granted admin rights**, not just `someuser`, as one might expect.

</details>
</details>
</details>
<details>
<summary><h3>SAML Authentication</h3></summary>

Internally, the Overleaf SAML module uses the [passport-saml](https://github.com/node-saml/passport-saml) library; most of the following configuration options are passed through to `passport-saml`. If you are having issues configuring SAML, it is worth reading the README for `passport-saml` to get a feel for the configuration it expects.

When using the SAML authentication method, a user is redirected to the Identity Provider (IdP) authentication site. If the IdP successfully authenticates the user, the Overleaf users database is checked for a record containing a `samlIdentifiers` field structured as follows:

```
samlIdentifiers: [
  {
    externalUserId: "...",
    providerId: "1",
    userIdAttribute: "..."
  }
]
```

The `externalUserId` must match the value of the property specified by `userIdAttribute` in the user profile returned by the IdP server.

If no matching record is found, the database is searched for a user with the primary email address matching the email in the IdP user profile:

- If such a user is found, the `hashedPassword` field is deleted to disable local authentication, and the `samlIdentifiers` field is added.
- If no matching user is found, a new user is created with the email address and `samlIdentifiers` from the IdP profile.

**Note:** Currently, only one SAML IdP is supported. The `providerId` field in `samlIdentifiers` is fixed to `'1'`.

#### Environment Variables
- `OVERLEAF_SAML_IDENTITY_SERVICE_NAME`
  * Display name for the identity service, used on the login page (default: `Log in with SAML IdP`).

- `OVERLEAF_SAML_USER_ID_FIELD`
  * The value of this attribute will be used by Overleaf as the external user ID, defaults to `nameID`.

- `OVERLEAF_SAML_EMAIL_FIELD`
  * Name of the Email field in the user profile, defaults to `nameID`.

- `OVERLEAF_SAML_FIRST_NAME_FIELD`
  * Name of the firstName field in the user profile, defaults to `givenName`.

- `OVERLEAF_SAML_LAST_NAME_FIELD`
  * Name of the lastName field in the user profile, defaults to `lastName`.

- `OVERLEAF_SAML_UPDATE_USER_DETAILS_ON_LOGIN`
  * If set to `true`, updates the user's `first_name` and `last_name` fields on login and disables the user details form on the `/user/settings` page.

- `OVERLEAF_SAML_ENTRYPOINT` **(required)**
  * Entrypoint URL for the SAML identity service.
    - Example: `https://idp.example.com/simplesaml/saml2/idp/SSOService.php`
    - Azure example: `https://login.microsoftonline.com/8b26b46a-6dd3-45c7-a104-f883f4db1f6b/saml2`

- `OVERLEAF_SAML_ISSUER` **(required)**
  * The Issuer name.

- `OVERLEAF_SAML_AUDIENCE`
  * Expected SAML response Audience, defaults to the value of `OVERLEAF_SAML_ISSUER`.

- `OVERLEAF_SAML_IDP_CERT` **(required)**
  * Path to a file containing the Identity Provider's public certificate, used to validate the signatures of incoming SAML responses. If the Identity Provider has multiple valid signing certificates, it can be a JSON array of paths.
    - Example (one certificate): `/overleaf/certs/idp_cert.pem`
    - Example (multiple certificates): `["/overleaf/certs/idp_cert.pem", "/overleaf/certs/idp_cert_old.pem"]`

- `OVERLEAF_SAML_PUBLIC_CERT`
  * Path to a file containing the public signing certificate embedded in auth requests so the IdP can validate the signatures of incoming SAML requests. It is required when setting up the [metadata endpoint](#metadata-for-the-identity-provider) and when the strategy is configured with an `OVERLEAF_SAML_PRIVATE_KEY`. A JSON array of paths to certificates can be provided to support certificate rotation; when supplying an array, the first entry should match the current `OVERLEAF_SAML_PRIVATE_KEY`. Additional entries can be used to publish upcoming certificates to IdPs before changing the `OVERLEAF_SAML_PRIVATE_KEY`.

- `OVERLEAF_SAML_PRIVATE_KEY`
  * Path to a file containing a PEM-formatted private key matching `OVERLEAF_SAML_PUBLIC_CERT`, used to sign auth requests sent by passport-saml.

- `OVERLEAF_SAML_DECRYPTION_CERT`
  * Path to a file containing a public certificate, used for the [metadata endpoint](#metadata-for-the-identity-provider).

- `OVERLEAF_SAML_DECRYPTION_PVK`
  * Path to a file containing a private key matching `OVERLEAF_SAML_DECRYPTION_CERT`, used to attempt to decrypt any encrypted assertions that are received.

- `OVERLEAF_SAML_SIGNATURE_ALGORITHM`
  * Optionally set the signature algorithm for signing requests; valid values are `sha1` (default), `sha256` (preferred), and `sha512` (most secure; check whether your IdP supports it).

- `OVERLEAF_SAML_ADDITIONAL_PARAMS`
  * JSON dictionary of additional query params to add to all requests.

- `OVERLEAF_SAML_ADDITIONAL_AUTHORIZE_PARAMS`
  * JSON dictionary of additional query params to add to 'authorize' requests.
    - Example: `{"some_key": "some_value"}`

- `OVERLEAF_SAML_IDENTIFIER_FORMAT`
  * Name identifier format to request from the identity provider (default: `urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress`). If using `urn:oasis:names:tc:SAML:2.0:nameid-format:persistent`, ensure the `OVERLEAF_SAML_EMAIL_FIELD` environment variable is defined. If `urn:oasis:names:tc:SAML:2.0:nameid-format:transient` is required, you must also define the `OVERLEAF_SAML_ID_FIELD` environment variable, which can, for example, be set to the user's email address.

- `OVERLEAF_SAML_ACCEPTED_CLOCK_SKEW_MS`
  * Time in milliseconds of skew that is acceptable between client and server when checking `OnBefore` and `NotOnOrAfter` assertion condition validity timestamps. Setting to `-1` disables checking these conditions entirely. Default is `0`.

- `OVERLEAF_SAML_ATTRIBUTE_CONSUMING_SERVICE_INDEX`
  * `AttributeConsumingServiceIndex` attribute to add to the AuthnRequest to instruct the IdP which attribute set to attach to the response ([link](http://blog.aniljohn.com/2014/01/data-minimization-front-channel-saml-attribute-requests.html)).

- `OVERLEAF_SAML_AUTHN_CONTEXT`
  * JSON array of name identifier format values to request auth context. Default: `["urn:oasis:names:tc:SAML:2.0:ac:classes:PasswordProtectedTransport"]`.

- `OVERLEAF_SAML_FORCE_AUTHN`
  * If `true`, the initial SAML request from the service provider specifies that the IdP should force re-authentication of the user, even if they possess a valid session.

- `OVERLEAF_SAML_DISABLE_REQUESTED_AUTHN_CONTEXT`
  * If `true`, do not request a specific auth context. For example, you can set this to `true` to allow additional contexts such as password-less logins (`urn:oasis:names:tc:SAML:2.0:ac:classes:X509`). Support for additional contexts is dependent on your IdP.

- `OVERLEAF_SAML_AUTHN_REQUEST_BINDING`
  * If set to `HTTP-POST`, authentication is requested from the IdP via HTTP POST binding; otherwise defaults to HTTP-Redirect.

- `OVERLEAF_SAML_VALIDATE_IN_RESPONSE_TO`
  * If `always`, then InResponseTo will be validated from incoming SAML responses.
  * If `never`, then InResponseTo won't be validated (default).
  * If `ifPresent`, then InResponseTo will only be validated if present in the incoming SAML response.

- `OVERLEAF_SAML_REQUEST_ID_EXPIRATION_PERIOD_MS`
  * Defines how long a Request ID generated for a SAML request remains valid when seen in the `InResponseTo` field of a SAML response. Default: 28800000 (8 hours).

- `OVERLEAF_SAML_LOGOUT_URL`
  * Base address to call with logout requests (default: `entryPoint`).
    - Example: `https://idp.example.com/simplesaml/saml2/idp/SingleLogoutService.php`

- `OVERLEAF_SAML_ADDITIONAL_LOGOUT_PARAMS`
  * JSON dictionary of additional query params to add to 'logout' requests.

- `OVERLEAF_SAML_IS_ADMIN_FIELD` and `OVERLEAF_SAML_IS_ADMIN_FIELD_VALUE`
  * When both environment variables are set, the login process sets `user.isAdmin = true` if the profile returned by the SAML IdP contains the attribute specified by `OVERLEAF_SAML_IS_ADMIN_FIELD` and its value either matches `OVERLEAF_SAML_IS_ADMIN_FIELD_VALUE` or is an array containing `OVERLEAF_SAML_IS_ADMIN_FIELD_VALUE`; otherwise `user.isAdmin` is set to `false`. If either of these variables is not set, admin status is only set to `true` during admin user creation in Launchpad.
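Taken together, the smallest useful SAML configuration consists of the required variables plus enabling the method. A sketch with the placeholder values used throughout this section (the full sample file below shows the complete picture):

```
EXTERNAL_AUTH=saml
OVERLEAF_SAML_ENTRYPOINT=https://idp.example.com/simplesaml/saml2/idp/SSOService.php
OVERLEAF_SAML_ISSUER=MyOverleaf
OVERLEAF_SAML_IDP_CERT=/overleaf/certs/idp_cert.pem
```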
#### Metadata for the Identity Provider

The current version of Overleaf CE includes an endpoint to retrieve Service Provider metadata: `http://my-overleaf-instance.com/saml/meta`
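You can fetch the generated metadata directly from a running instance, for example (assuming `curl` is available on the host):

```
curl -s http://my-overleaf-instance.com/saml/meta -o ol-meta.xml
```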
The Identity Provider will need to be configured to recognize the Overleaf server as a "Service Provider". Consult the documentation for your SAML server for instructions on how to do this.

Below is an example of appropriate Service Provider metadata:
<details>
<summary><h5>ol-meta.xml</h5></summary>

```
<?xml version="1.0"?>
<EntityDescriptor xmlns="urn:oasis:names:tc:SAML:2.0:metadata"
                  xmlns:ds="http://www.w3.org/2000/09/xmldsig#"
                  entityID="MyOverleaf"
                  ID="_b508c83b7dda452f5b269383fb391107116f8f57">
  <SPSSODescriptor protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol" AuthnRequestsSigned="true" WantAssertionsSigned="true">
    <KeyDescriptor use="signing">
      <ds:KeyInfo>
        <ds:X509Data>
          <ds:X509Certificate>MII...
          [skipped]
          </ds:X509Certificate>
        </ds:X509Data>
      </ds:KeyInfo>
    </KeyDescriptor>
    <KeyDescriptor use="encryption">
      <ds:KeyInfo>
        <ds:X509Data>
          <ds:X509Certificate>MII...
          [skipped]
          </ds:X509Certificate>
        </ds:X509Data>
      </ds:KeyInfo>
      <EncryptionMethod Algorithm="http://www.w3.org/2009/xmlenc11#aes256-gcm"/>
      <EncryptionMethod Algorithm="http://www.w3.org/2009/xmlenc11#aes128-gcm"/>
      <EncryptionMethod Algorithm="http://www.w3.org/2001/04/xmlenc#aes256-cbc"/>
      <EncryptionMethod Algorithm="http://www.w3.org/2001/04/xmlenc#aes128-cbc"/>
    </KeyDescriptor>
    <SingleLogoutService Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST"
                         Location="https://my-overleaf-instance.com/saml/logout/callback"/>
    <NameIDFormat>urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress</NameIDFormat>
    <AssertionConsumerService index="1"
                              isDefault="true"
                              Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST"
                              Location="https://my-overleaf-instance.com/saml/login/callback"/>
  </SPSSODescriptor>
</EntityDescriptor>
```
</details>
Note the certificates, `AssertionConsumerService.Location`, `SingleLogoutService.Location`, and `EntityDescriptor.entityID`, and set them as appropriate in your IdP configuration, or send the metadata file to the IdP admin.
<details>
<summary><h4>Sample variables.env file</h4></summary>

```
OVERLEAF_APP_NAME="Our Overleaf Instance"

ENABLED_LINKED_FILE_TYPES=project_file,project_output_file

# Enables Thumbnail generation using ImageMagick
ENABLE_CONVERSIONS=true

# Disables email confirmation requirement
EMAIL_CONFIRMATION_DISABLED=true

## Nginx
# NGINX_WORKER_PROCESSES=4
# NGINX_WORKER_CONNECTIONS=768

## Set for TLS via nginx-proxy
# OVERLEAF_BEHIND_PROXY=true
# OVERLEAF_SECURE_COOKIE=true

OVERLEAF_SITE_URL=http://my-overleaf-instance.com
OVERLEAF_NAV_TITLE=Our Overleaf Instance
# OVERLEAF_HEADER_IMAGE_URL=http://somewhere.com/mylogo.png
OVERLEAF_ADMIN_EMAIL=support@example.com

OVERLEAF_LEFT_FOOTER=[{"text": "Contact your support team", "url": "mailto:support@example.com"}]
OVERLEAF_RIGHT_FOOTER=[{"text":"Hello, I am on the Right", "url":"https://github.com/yu-i-i/overleaf-cep"}]

OVERLEAF_EMAIL_FROM_ADDRESS=team@example.com
OVERLEAF_EMAIL_SMTP_HOST=smtp.example.com
OVERLEAF_EMAIL_SMTP_PORT=587
OVERLEAF_EMAIL_SMTP_SECURE=false
# OVERLEAF_EMAIL_SMTP_USER=
# OVERLEAF_EMAIL_SMTP_PASS=
# OVERLEAF_EMAIL_SMTP_NAME=
OVERLEAF_EMAIL_SMTP_LOGGER=false
OVERLEAF_EMAIL_SMTP_TLS_REJECT_UNAUTH=true
OVERLEAF_EMAIL_SMTP_IGNORE_TLS=false
OVERLEAF_CUSTOM_EMAIL_FOOTER=This system is run by department x

OVERLEAF_PROXY_LEARN=true
NAV_HIDE_POWERED_BY=true

#################
## SAML for CE ##
#################

EXTERNAL_AUTH=saml
OVERLEAF_SAML_IDENTITY_SERVICE_NAME='Log in with My IdP'
OVERLEAF_SAML_EMAIL_FIELD=mail
OVERLEAF_SAML_FIRST_NAME_FIELD=givenName
OVERLEAF_SAML_LAST_NAME_FIELD=sn
OVERLEAF_SAML_ENTRYPOINT=https://idp.example.com/simplesamlphp/saml2/idp/SSOService.php
OVERLEAF_SAML_CALLBACK_URL=https://my-overleaf-instance.com/saml/login/callback
OVERLEAF_SAML_LOGOUT_URL=https://idp.example.com/simplesamlphp/saml2/idp/SingleLogoutService.php
OVERLEAF_SAML_LOGOUT_CALLBACK_URL=https://my-overleaf-instance.com/saml/logout/callback
OVERLEAF_SAML_ISSUER=MyOverleaf
OVERLEAF_SAML_IDP_CERT=/overleaf/certs/idp_cert.pem
OVERLEAF_SAML_PUBLIC_CERT=/overleaf/certs/myol_cert.pem
OVERLEAF_SAML_PRIVATE_KEY=/overleaf/certs/myol_key.pem
OVERLEAF_SAML_DECRYPTION_CERT=/overleaf/certs/myol_decr_cert.pem
OVERLEAF_SAML_DECRYPTION_PVK=/overleaf/certs/myol_decr_key.pem
OVERLEAF_SAML_IS_ADMIN_FIELD=mail
OVERLEAF_SAML_IS_ADMIN_FIELD_VALUE=overleaf.admin@example.com
```
</details>
</details>
<details>
<summary><h3>OIDC Authentication</h3></summary>

Internally, the Overleaf OIDC module uses the [passport-openidconnect](https://github.com/jaredhanson/passport-openidconnect) library. If you are having issues configuring OpenID Connect, it is worth reading the README for `passport-openidconnect` to get a feel for the configuration it expects.

When using the OIDC authentication method, a user is redirected to the Identity Provider (IdP) authentication site. If the IdP successfully authenticates the user, the Overleaf users database is checked for a record containing a `thirdPartyIdentifiers` field structured as follows:

```
thirdPartyIdentifiers: [
  {
    externalUserId: "...",
    externalData: null,
    providerId: "..."
  }
]
```

The `externalUserId` must match the user ID in the profile returned by the IdP server (see the `OVERLEAF_OIDC_USER_ID_FIELD` environment variable), and `providerId` must match the ID of the OIDC provider (see `OVERLEAF_OIDC_PROVIDER_ID`).

If no matching record is found, the database is searched for a user with the primary email address matching the email in the IdP user profile:
- If such a user is found, the `thirdPartyIdentifiers` field is updated.
- If no matching user is found, a new user is created with the email address and `thirdPartyIdentifiers` from the IdP profile.

In both cases, the user is said to be 'linked' to the external OIDC user. The user can be unlinked from the OIDC provider on the `/user/settings` page.

#### Environment Variables

The values of the following five required variables can be found using the `.well-known/openid-configuration` endpoint of your OpenID Provider (OP); see the sketch after this list.

- `OVERLEAF_OIDC_ISSUER` **(required)**
- `OVERLEAF_OIDC_AUTHORIZATION_URL` **(required)**
- `OVERLEAF_OIDC_TOKEN_URL` **(required)**
- `OVERLEAF_OIDC_USER_INFO_URL` **(required)**
- `OVERLEAF_OIDC_LOGOUT_URL` **(required)**
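For example, for the Keycloak realm used in the sample file below, the discovery document can be inspected like this (a sketch; `python3 -m json.tool` is used only for pretty-printing). The standard discovery fields `issuer`, `authorization_endpoint`, `token_endpoint`, `userinfo_endpoint`, and `end_session_endpoint` map onto the five variables above:

```
curl -s https://keycloak.provider.com/realms/example/.well-known/openid-configuration \
  | python3 -m json.tool
```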
The values of the following two required variables will be provided by the admin of your OP:

- `OVERLEAF_OIDC_CLIENT_ID` **(required)**
- `OVERLEAF_OIDC_CLIENT_SECRET` **(required)**

- `OVERLEAF_OIDC_SCOPE`
  * Default: `openid profile email`

- `OVERLEAF_OIDC_PROVIDER_ID`
  * Arbitrary ID of the OP, defaults to `oidc`.

- `OVERLEAF_OIDC_PROVIDER_NAME`
  * The name of the OP, used in the `Linked Accounts` section of the `/user/settings` page, defaults to `OIDC Provider`.

- `OVERLEAF_OIDC_IDENTITY_SERVICE_NAME`
  * Display name for the identity service, used on the login page (default: `Log in with $OVERLEAF_OIDC_PROVIDER_NAME`).

- `OVERLEAF_OIDC_PROVIDER_DESCRIPTION`
  * Description of the OP, used in the `Linked Accounts` section (default: `Log in with $OVERLEAF_OIDC_PROVIDER_NAME`).

- `OVERLEAF_OIDC_PROVIDER_INFO_LINK`
  * `Learn more` URL in the OP description; by default, no `Learn more` link is shown.

- `OVERLEAF_OIDC_PROVIDER_HIDE_NOT_LINKED`
  * If `true`, do not show the OP on the `/user/settings` page when the user's account is not linked with the OP; defaults to `false`.

- `OVERLEAF_OIDC_USER_ID_FIELD`
  * The value of this attribute will be used by Overleaf as the external user ID, defaults to `id`.

- `OVERLEAF_OIDC_UPDATE_USER_DETAILS_ON_LOGIN`
  * If set to `true`, updates the user's `first_name` and `last_name` fields on login and disables the user details form on the `/user/settings` page.

- `OVERLEAF_OIDC_IS_ADMIN_FIELD` and `OVERLEAF_OIDC_IS_ADMIN_FIELD_VALUE`
  * When both environment variables are set, the login process sets `user.isAdmin = true` if the profile returned by the OP contains the attribute specified by `OVERLEAF_OIDC_IS_ADMIN_FIELD` and its value matches `OVERLEAF_OIDC_IS_ADMIN_FIELD_VALUE`; otherwise `user.isAdmin` is set to `false`. If `OVERLEAF_OIDC_IS_ADMIN_FIELD` is `email`, the value of the attribute `emails[0].value` is used for the match check.
<details>
<summary><h4>Sample variables.env file</h4></summary>

```
OVERLEAF_APP_NAME="Our Overleaf Instance"

ENABLED_LINKED_FILE_TYPES=project_file,project_output_file

# Enables Thumbnail generation using ImageMagick
ENABLE_CONVERSIONS=true

# Disables email confirmation requirement
EMAIL_CONFIRMATION_DISABLED=true

## Nginx
# NGINX_WORKER_PROCESSES=4
# NGINX_WORKER_CONNECTIONS=768

## Set for TLS via nginx-proxy
# OVERLEAF_BEHIND_PROXY=true
# OVERLEAF_SECURE_COOKIE=true

OVERLEAF_SITE_URL=http://my-overleaf-instance.com
OVERLEAF_NAV_TITLE=Our Overleaf Instance
# OVERLEAF_HEADER_IMAGE_URL=http://somewhere.com/mylogo.png
OVERLEAF_ADMIN_EMAIL=support@example.com

OVERLEAF_LEFT_FOOTER=[{"text": "Contact your support team", "url": "mailto:support@example.com"}]
OVERLEAF_RIGHT_FOOTER=[{"text":"Hello, I am on the Right", "url":"https://github.com/yu-i-i/overleaf-cep"}]

OVERLEAF_EMAIL_FROM_ADDRESS=team@example.com
OVERLEAF_EMAIL_SMTP_HOST=smtp.example.com
OVERLEAF_EMAIL_SMTP_PORT=587
OVERLEAF_EMAIL_SMTP_SECURE=false
# OVERLEAF_EMAIL_SMTP_USER=
# OVERLEAF_EMAIL_SMTP_PASS=
# OVERLEAF_EMAIL_SMTP_NAME=
OVERLEAF_EMAIL_SMTP_LOGGER=false
OVERLEAF_EMAIL_SMTP_TLS_REJECT_UNAUTH=true
OVERLEAF_EMAIL_SMTP_IGNORE_TLS=false
OVERLEAF_CUSTOM_EMAIL_FOOTER=This system is run by department x

OVERLEAF_PROXY_LEARN=true
NAV_HIDE_POWERED_BY=true

#################
## OIDC for CE ##
#################

EXTERNAL_AUTH=oidc

OVERLEAF_OIDC_PROVIDER_ID=oidc
OVERLEAF_OIDC_ISSUER=https://keycloak.provider.com/realms/example
OVERLEAF_OIDC_AUTHORIZATION_URL=https://keycloak.provider.com/realms/example/protocol/openid-connect/auth
OVERLEAF_OIDC_TOKEN_URL=https://keycloak.provider.com/realms/example/protocol/openid-connect/token
OVERLEAF_OIDC_USER_INFO_URL=https://keycloak.provider.com/realms/example/protocol/openid-connect/userinfo
OVERLEAF_OIDC_LOGOUT_URL=https://keycloak.provider.com/realms/example/protocol/openid-connect/logout
OVERLEAF_OIDC_CLIENT_ID=Overleaf-OIDC
OVERLEAF_OIDC_CLIENT_SECRET=DoNotUseThisATGgaAcTgCcATgGATTACAagGtTCaGcGTAG
OVERLEAF_OIDC_IDENTITY_SERVICE_NAME='Log in with Keycloak OIDC Provider'
OVERLEAF_OIDC_PROVIDER_NAME=OIDC Keycloak Provider
OVERLEAF_OIDC_PROVIDER_INFO_LINK=https://openid.net
OVERLEAF_OIDC_IS_ADMIN_FIELD=email
OVERLEAF_OIDC_IS_ADMIN_FIELD_VALUE=overleaf.admin@example.com
OVERLEAF_OIDC_UPDATE_USER_DETAILS_ON_LOGIN=false
```
</details>
</details>
## Overleaf Docker Image

This repo contains two dockerfiles: [`Dockerfile-base`](server-ce/Dockerfile-base), which builds the `sharelatex/sharelatex-base:ext-ce` image, and [`Dockerfile`](server-ce/Dockerfile), which builds the `sharelatex/sharelatex:ext-ce` image.

The base image generally contains the basic dependencies like `wget`, plus `texlive`. This is split out because it's a pretty heavy set of dependencies, and it's nice to not have to rebuild all of that every time.

The `sharelatex/sharelatex` image extends the base image and adds the actual Overleaf code
@ -67,19 +890,20 @@ and services.

Use `make build-base` and `make build-community` from `server-ce/` to build these images.
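That is, assuming you start at the repository root:

```
cd server-ce
make build-base
make build-community
```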
The [Phusion base-image](https://github.com/phusion/baseimage-docker) (which is extended by the `base` image) provides a VM-like container in which to run the Overleaf services. Baseimage uses the `runit` service manager to manage services, and init scripts from the `server-ce/runit` folder are added.
## Authors

[The Overleaf Team](https://www.overleaf.com/about)
<br>
Extensions for CE by: [yu-i-i](https://github.com/yu-i-i/overleaf-cep)

## License

The code in this repository is released under the GNU AFFERO GENERAL PUBLIC LICENSE, version 3. A copy can be found in the [`LICENSE`](LICENSE) file.

Copyright (c) Overleaf, 2014-2025.
@ -11,6 +11,12 @@ bin/build

> [!NOTE]
> If Docker is running out of RAM while building the services in parallel, create a `.env` file in this directory containing `COMPOSE_PARALLEL_LIMIT=1`.

Next, initialize the database:

```shell
bin/init
```

Then start the services:

```shell
@ -42,7 +48,7 @@ To do this, use the included `bin/dev` script:

```shell
bin/dev
```

This will start all services using `node --watch`, which will automatically monitor the code and restart the services as necessary.

To improve performance, you can start only a subset of the services in development mode by providing a space-separated list to the `bin/dev` script, for example:
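A sketch, selecting two of the service names listed in the ports table further down (any subset should work the same way):

```shell
bin/dev web real-time
```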
@ -77,7 +83,6 @@ each service:

| `filestore` | 9235 |
| `notifications` | 9236 |
| `real-time` | 9237 |
| `references` | 9238 |
| `history-v1` | 9239 |
| `project-history` | 9240 |
develop/bin/init (new executable file, 6 lines)

@ -0,0 +1,6 @@
#!/usr/bin/env bash

docker compose up --detach mongo
curl --max-time 10 --retry 5 --retry-delay 5 --retry-all-errors --silent --output /dev/null localhost:27017
docker compose exec mongo mongosh --eval "rs.initiate({ _id: 'overleaf', members: [{ _id: 0, host: 'mongo:27017' }] })"
docker compose down mongo
@ -6,18 +6,15 @@ DOCUMENT_UPDATER_HOST=document-updater
FILESTORE_HOST=filestore
GRACEFUL_SHUTDOWN_DELAY_SECONDS=0
HISTORY_V1_HOST=history-v1
HISTORY_REDIS_HOST=redis
LISTEN_ADDRESS=0.0.0.0
MONGO_HOST=mongo
MONGO_URL=mongodb://mongo/sharelatex?directConnection=true
NOTIFICATIONS_HOST=notifications
PROJECT_HISTORY_HOST=project-history
QUEUES_REDIS_HOST=redis
REALTIME_HOST=real-time
REDIS_HOST=redis
REFERENCES_HOST=references
SESSION_SECRET=foo
V1_HISTORY_HOST=history-v1
WEBPACK_HOST=webpack
WEB_API_PASSWORD=overleaf
WEB_API_USER=overleaf
@ -117,14 +117,14 @@ services:
    environment:
      - NODE_OPTIONS=--inspect=0.0.0.0:9229
    ports:
      - "127.0.0.1:9238:9229"
      - "127.0.0.1:9236:9229"
    volumes:
      - ../services/references/app:/overleaf/services/references/app
      - ../services/references/config:/overleaf/services/references/config
      - ../services/references/app.js:/overleaf/services/references/app.js

  web:
    command: ["node", "--watch", "app.mjs", "--watch-locales"]
    command: ["node", "--watch", "app.js", "--watch-locales"]
    environment:
      - NODE_OPTIONS=--inspect=0.0.0.0:9229
    ports:
@ -1,5 +1,6 @@
volumes:
  clsi-cache:
  clsi-output:
  filestore-public-files:
  filestore-template-files:
  filestore-uploads:

@ -25,16 +26,15 @@ services:
    env_file:
      - dev.env
    environment:
      - DOCKER_RUNNER=true
      - TEXLIVE_IMAGE=texlive-full # docker build texlive -t texlive-full
      - SANDBOXED_COMPILES=true
      - SANDBOXED_COMPILES_HOST_DIR_COMPILES=${PWD}/compiles
      - SANDBOXED_COMPILES_HOST_DIR_OUTPUT=${PWD}/output
      - COMPILES_HOST_DIR=${PWD}/compiles
    user: root
    volumes:
      - ${PWD}/compiles:/overleaf/services/clsi/compiles
      - ${PWD}/output:/overleaf/services/clsi/output
      - ${DOCKER_SOCKET_PATH:-/var/run/docker.sock}:/var/run/docker.sock
      - clsi-cache:/overleaf/services/clsi/cache
      - clsi-output:/overleaf/services/clsi/output

  contacts:
    build:
@ -88,20 +88,12 @@ services:
      - history-v1-buckets:/buckets

  mongo:
    image: mongo:6.0
    image: mongo:5
    command: --replSet overleaf
    ports:
      - "127.0.0.1:27017:27017" # for debugging
    volumes:
      - mongo-data:/data/db
      - ../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
    environment:
      MONGO_INITDB_DATABASE: sharelatex
    extra_hosts:
      # Required when using the automatic database setup for initializing the
      # replica set. This override is not needed when running the setup after
      # starting up mongo.
      - mongo:127.0.0.1

  notifications:
    build:
@ -123,7 +115,7 @@ services:
      dockerfile: services/real-time/Dockerfile
    env_file:
      - dev.env

  redis:
    image: redis:5
    ports:

@ -147,7 +139,7 @@ services:
      - dev.env
    environment:
      - APP_NAME=Overleaf Community Edition
      - ENABLED_LINKED_FILE_TYPES=project_file,project_output_file,url
      - ENABLED_LINKED_FILE_TYPES=project_file,project_output_file
      - EMAIL_CONFIRMATION_DISABLED=true
      - NODE_ENV=development
      - OVERLEAF_ALLOW_PUBLIC_ACCESS=true
doc/logo.png (binary file not shown; before: 13 KiB, after: 71 KiB)
@ -32,7 +32,7 @@ services:
      OVERLEAF_REDIS_HOST: redis
      REDIS_HOST: redis

      ENABLED_LINKED_FILE_TYPES: 'project_file,project_output_file,url'
      ENABLED_LINKED_FILE_TYPES: 'project_file,project_output_file'

      # Enables Thumbnail generation using ImageMagick
      ENABLE_CONVERSIONS: 'true'

@ -40,6 +40,10 @@ services:
      # Disables email confirmation requirement
      EMAIL_CONFIRMATION_DISABLED: 'true'

      # temporary fix for LuaLaTex compiles
      # see https://github.com/overleaf/overleaf/issues/695
      TEXMFVAR: /var/lib/overleaf/tmp/texmf-var

      ## Set for SSL via nginx-proxy
      #VIRTUAL_HOST: 103.112.212.22
@ -73,19 +77,11 @@ services:
      ## Server Pro ##
      ################

      ## The Community Edition is intended for use in environments where all users are trusted and is not appropriate for
      ## scenarios where isolation of users is required. Sandboxed Compiles are not available in the Community Edition,
      ## so the following environment variables must be commented out to avoid compile issues.
      ##
      ## Sandboxed Compiles: https://docs.overleaf.com/on-premises/configuration/overleaf-toolkit/server-pro-only-configuration/sandboxed-compiles
      ## Sandboxed Compiles: https://github.com/overleaf/overleaf/wiki/Server-Pro:-Sandboxed-Compiles
      SANDBOXED_COMPILES: 'true'
      ### Bind-mount source for /var/lib/overleaf/data/compiles inside the container.
      SANDBOXED_COMPILES_HOST_DIR_COMPILES: '/home/user/sharelatex_data/data/compiles'
      ### Bind-mount source for /var/lib/overleaf/data/output inside the container.
      SANDBOXED_COMPILES_HOST_DIR_OUTPUT: '/home/user/sharelatex_data/data/output'
      ### Backwards compatibility (before Server Pro 5.5)
      DOCKER_RUNNER: 'true'
      SANDBOXED_COMPILES_SIBLING_CONTAINERS: 'true'
      ### Bind-mount source for /var/lib/overleaf/data/compiles inside the container.
      SANDBOXED_COMPILES_HOST_DIR: '/home/user/sharelatex_data/data/compiles'

      ## Works with test LDAP server shown at bottom of docker compose
      # OVERLEAF_LDAP_URL: 'ldap://ldap:389'
@ -106,12 +102,12 @@ services:

  mongo:
    restart: always
    image: mongo:6.0
    image: mongo:5.0
    container_name: mongo
    command: '--replSet overleaf'
    volumes:
      - ~/mongo_data:/data/db
      - ./bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
      - ./mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
    environment:
      MONGO_INITDB_DATABASE: sharelatex
    extra_hosts:

@ -119,7 +115,7 @@ services:
      # This override is not needed when running the setup after starting up mongo.
      - mongo:127.0.0.1
    healthcheck:
      test: echo 'db.stats().ok' | mongosh localhost:27017/test --quiet
      test: echo 'db.stats().ok' | mongo localhost:27017/test --quiet
      interval: 10s
      timeout: 10s
      retries: 5
libraries/access-token-encryptor/.dockerignore (new file, 1 line)

@ -0,0 +1 @@
node_modules/
libraries/access-token-encryptor/.gitignore (new vendored file, 46 lines)

@ -0,0 +1,46 @@
compileFolder

# Compiled source #
###################
*.com
*.class
*.dll
*.exe
*.o
*.so

# Packages #
############
# it's better to unpack these files and commit the raw source
# git has its own built in compression methods
*.7z
*.dmg
*.gz
*.iso
*.jar
*.rar
*.tar
*.zip

# Logs and databases #
######################
*.log
*.sql
*.sqlite

# OS generated files #
######################
.DS_Store?
ehthumbs.db
Icon?
Thumbs.db

/node_modules/*
data/*/*

**.swp

/log.json
hash_folder

.npmrc
@ -1 +1 @@
|
|||
22.17.0
|
||||
20.18.0
|
||||
|
|
|
@ -1,10 +1,10 @@
|
|||
access-token-encryptor
|
||||
--dependencies=None
|
||||
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
|
||||
--docker-repos=gcr.io/overleaf-ops
|
||||
--env-add=
|
||||
--env-pass-through=
|
||||
--esmock-loader=False
|
||||
--is-library=True
|
||||
--node-version=22.17.0
|
||||
--node-version=20.18.0
|
||||
--public-repo=False
|
||||
--script-version=4.7.0
|
||||
--script-version=4.5.0
|
||||
|
|
|
@ -21,7 +21,7 @@
|
|||
"devDependencies": {
|
||||
"chai": "^4.3.6",
|
||||
"chai-as-promised": "^7.1.1",
|
||||
"mocha": "^11.1.0",
|
||||
"mocha": "^10.2.0",
|
||||
"sandboxed-module": "^2.0.4",
|
||||
"typescript": "^5.0.4"
|
||||
}
|
||||
|
|
libraries/fetch-utils/.dockerignore (new file, 1 line):

```diff
@@ -0,0 +1 @@
+node_modules/
```

libraries/fetch-utils/.gitignore (new file, vendored, 3 lines):

```diff
@@ -0,0 +1,3 @@
+
+# managed by monorepo$ bin/update_build_scripts
+.npmrc
```

```diff
@@ -1 +1 @@
-22.17.0
+20.18.0

@@ -1,10 +1,10 @@
 fetch-utils
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
```
```diff
@@ -23,11 +23,11 @@ async function fetchJson(url, opts = {}) {
 }

 async function fetchJsonWithResponse(url, opts = {}) {
-const { fetchOpts, detachSignal } = parseOpts(opts)
+const { fetchOpts } = parseOpts(opts)
 fetchOpts.headers = fetchOpts.headers ?? {}
 fetchOpts.headers.Accept = fetchOpts.headers.Accept ?? 'application/json'

-const response = await performRequest(url, fetchOpts, detachSignal)
+const response = await performRequest(url, fetchOpts)
 if (!response.ok) {
 const body = await maybeGetResponseBody(response)
 throw new RequestFailedError(url, opts, response, body)

@@ -53,8 +53,8 @@ async function fetchStream(url, opts = {}) {
 }

 async function fetchStreamWithResponse(url, opts = {}) {
-const { fetchOpts, abortController, detachSignal } = parseOpts(opts)
-const response = await performRequest(url, fetchOpts, detachSignal)
+const { fetchOpts, abortController } = parseOpts(opts)
+const response = await performRequest(url, fetchOpts)

 if (!response.ok) {
 const body = await maybeGetResponseBody(response)

@@ -76,8 +76,8 @@ async function fetchStreamWithResponse(url, opts = {}) {
 * @throws {RequestFailedError} if the response has a failure status code
 */
 async function fetchNothing(url, opts = {}) {
-const { fetchOpts, detachSignal } = parseOpts(opts)
-const response = await performRequest(url, fetchOpts, detachSignal)
+const { fetchOpts } = parseOpts(opts)
+const response = await performRequest(url, fetchOpts)
 if (!response.ok) {
 const body = await maybeGetResponseBody(response)
 throw new RequestFailedError(url, opts, response, body)

@@ -95,22 +95,9 @@ async function fetchNothing(url, opts = {}) {
 * @throws {RequestFailedError} if the response has a non redirect status code or missing Location header
 */
 async function fetchRedirect(url, opts = {}) {
-const { location } = await fetchRedirectWithResponse(url, opts)
-return location
-}
-
-/**
- * Make a request and extract the redirect from the response.
- *
- * @param {string | URL} url - request URL
- * @param {object} opts - fetch options
- * @return {Promise<{location: string, response: Response}>}
- * @throws {RequestFailedError} if the response has a non redirect status code or missing Location header
- */
-async function fetchRedirectWithResponse(url, opts = {}) {
-const { fetchOpts, detachSignal } = parseOpts(opts)
+const { fetchOpts } = parseOpts(opts)
 fetchOpts.redirect = 'manual'
-const response = await performRequest(url, fetchOpts, detachSignal)
+const response = await performRequest(url, fetchOpts)
 if (response.status < 300 || response.status >= 400) {
 const body = await maybeGetResponseBody(response)
 throw new RequestFailedError(url, opts, response, body)

@@ -125,7 +112,7 @@ async function fetchRedirectWithResponse(url, opts = {}) {
 )
 }
 await discardResponseBody(response)
-return { location, response }
+return location
 }

 /**

@@ -142,8 +129,8 @@ async function fetchString(url, opts = {}) {
 }

 async function fetchStringWithResponse(url, opts = {}) {
-const { fetchOpts, detachSignal } = parseOpts(opts)
-const response = await performRequest(url, fetchOpts, detachSignal)
+const { fetchOpts } = parseOpts(opts)
+const response = await performRequest(url, fetchOpts)
 if (!response.ok) {
 const body = await maybeGetResponseBody(response)
 throw new RequestFailedError(url, opts, response, body)

@@ -178,14 +165,13 @@ function parseOpts(opts) {

 const abortController = new AbortController()
 fetchOpts.signal = abortController.signal
-let detachSignal = () => {}
 if (opts.signal) {
-detachSignal = abortOnSignal(abortController, opts.signal)
+abortOnSignal(abortController, opts.signal)
 }
 if (opts.body instanceof Readable) {
 abortOnDestroyedRequest(abortController, fetchOpts.body)
 }
-return { fetchOpts, abortController, detachSignal }
+return { fetchOpts, abortController }
 }

 function setupJsonBody(fetchOpts, json) {

@@ -209,9 +195,6 @@ function abortOnSignal(abortController, signal) {
 abortController.abort(signal.reason)
 }
 signal.addEventListener('abort', listener)
-return () => {
-signal.removeEventListener('abort', listener)
-}
 }

 function abortOnDestroyedRequest(abortController, stream) {

@@ -230,12 +213,11 @@ function abortOnDestroyedResponse(abortController, response) {
 })
 }

-async function performRequest(url, fetchOpts, detachSignal) {
+async function performRequest(url, fetchOpts) {
 let response
 try {
 response = await fetch(url, fetchOpts)
 } catch (err) {
-detachSignal()
 if (fetchOpts.body instanceof Readable) {
 fetchOpts.body.destroy()
 }

@@ -244,7 +226,6 @@ async function performRequest(url, fetchOpts, detachSignal) {
 method: fetchOpts.method ?? 'GET',
 })
 }
-response.body.on('close', detachSignal)
 if (fetchOpts.body instanceof Readable) {
 response.body.on('close', () => {
 if (!fetchOpts.body.readableEnded) {

@@ -316,7 +297,6 @@ module.exports = {
 fetchStreamWithResponse,
 fetchNothing,
 fetchRedirect,
-fetchRedirectWithResponse,
 fetchString,
 fetchStringWithResponse,
 RequestFailedError,
```
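The hunks above remove the `detachSignal` plumbing on the v5.3.1 side. A minimal self-contained sketch of the pattern the ext-ce side uses (the function body follows the diff; the usage at the bottom is illustrative):

```js
// abortOnSignal forwards an external abort signal to the request's own
// AbortController and returns a cleanup function.
function abortOnSignal(abortController, signal) {
  const listener = () => {
    abortController.abort(signal.reason)
  }
  signal.addEventListener('abort', listener)
  // Detaching after the request settles keeps long-lived signals
  // (e.g. AbortSignal.timeout shared across many requests) from
  // accumulating one 'abort' listener per request.
  return () => {
    signal.removeEventListener('abort', listener)
  }
}

// Illustrative usage:
const controller = new AbortController()
const signal = AbortSignal.timeout(10_000)
const detachSignal = abortOnSignal(controller, signal)
// ...perform the request, then once the response body is closed:
detachSignal()
```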
```diff
@@ -20,8 +20,8 @@
 "body-parser": "^1.20.3",
 "chai": "^4.3.6",
 "chai-as-promised": "^7.1.1",
-"express": "^4.21.2",
-"mocha": "^11.1.0",
+"express": "^4.21.0",
+"mocha": "^10.2.0",
 "typescript": "^5.0.4"
 },
 "dependencies": {
```
```diff
@@ -1,9 +1,6 @@
 const { expect } = require('chai')
-const fs = require('node:fs')
-const events = require('node:events')
 const { FetchError, AbortError } = require('node-fetch')
 const { Readable } = require('node:stream')
-const { pipeline } = require('node:stream/promises')
 const { once } = require('node:events')
 const { TestServer } = require('./helpers/TestServer')
 const selfsigned = require('selfsigned')

@@ -206,31 +203,6 @@ describe('fetch-utils', function () {
 ).to.be.rejectedWith(AbortError)
 expect(stream.destroyed).to.be.true
 })
-
-it('detaches from signal on success', async function () {
-const signal = AbortSignal.timeout(10_000)
-for (let i = 0; i < 20; i++) {
-const s = await fetchStream(this.url('/hello'), { signal })
-expect(events.getEventListeners(signal, 'abort')).to.have.length(1)
-await pipeline(s, fs.createWriteStream('/dev/null'))
-expect(events.getEventListeners(signal, 'abort')).to.have.length(0)
-}
-})
-
-it('detaches from signal on error', async function () {
-const signal = AbortSignal.timeout(10_000)
-for (let i = 0; i < 20; i++) {
-try {
-await fetchStream(this.url('/500'), { signal })
-} catch (err) {
-if (err instanceof RequestFailedError && err.response.status === 500)
-continue
-throw err
-} finally {
-expect(events.getEventListeners(signal, 'abort')).to.have.length(0)
-}
-}
-})
 })

 describe('fetchNothing', function () {

@@ -419,16 +391,9 @@ async function* infiniteIterator() {
 async function abortOnceReceived(func, server) {
 const controller = new AbortController()
 const promise = func(controller.signal)
-expect(events.getEventListeners(controller.signal, 'abort')).to.have.length(1)
 await once(server.events, 'request-received')
 controller.abort()
-try {
-return await promise
-} finally {
-expect(events.getEventListeners(controller.signal, 'abort')).to.have.length(
-0
-)
-}
+return await promise
 }

 async function expectRequestAborted(req) {
```
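The removed tests rely on Node's listener introspection API. A self-contained sketch of the check they perform, using only built-in modules:

```js
const events = require('node:events')

// Mirrors what the removed tests assert: after a request settles, the shared
// signal should carry no lingering 'abort' listeners.
const signal = AbortSignal.timeout(10_000)
const controller = new AbortController()

const listener = () => controller.abort(signal.reason)
signal.addEventListener('abort', listener)
console.log(events.getEventListeners(signal, 'abort').length) // 1

signal.removeEventListener('abort', listener) // what detachSignal() does
console.log(events.getEventListeners(signal, 'abort').length) // 0
```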
libraries/logger/.dockerignore (new file, 1 line):

```diff
@@ -0,0 +1 @@
+node_modules/
```

libraries/logger/.gitignore (new file, vendored, 3 lines):

```diff
@@ -0,0 +1,3 @@
+node_modules
+
+.npmrc
```

```diff
@@ -1 +1 @@
-22.17.0
+20.18.0

@@ -1,10 +1,10 @@
 logger
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0

@@ -14,10 +14,8 @@ const LoggingManager = {
 initialize(name) {
 this.isProduction =
 (process.env.NODE_ENV || '').toLowerCase() === 'production'
-const isTest = (process.env.NODE_ENV || '').toLowerCase() === 'test'
 this.defaultLevel =
-process.env.LOG_LEVEL ||
-(this.isProduction ? 'info' : isTest ? 'fatal' : 'debug')
+process.env.LOG_LEVEL || (this.isProduction ? 'info' : 'debug')
 this.loggerName = name
 this.logger = bunyan.createLogger({
 name,

@@ -27,7 +27,7 @@
 },
 "devDependencies": {
 "chai": "^4.3.6",
-"mocha": "^11.1.0",
+"mocha": "^10.2.0",
 "sandboxed-module": "^2.0.4",
 "sinon": "^9.2.4",
 "sinon-chai": "^3.7.0",
```
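A self-contained sketch of the level-defaulting logic on the ext-ce side of the LoggingManager hunk: tests get a near-silent `fatal` default unless `LOG_LEVEL` overrides it.

```js
// Replicates the defaulting expression from the diff above as a standalone
// snippet; run with NODE_ENV=test / production / unset to compare.
const env = (process.env.NODE_ENV || '').toLowerCase()
const isProduction = env === 'production'
const isTest = env === 'test'

const defaultLevel =
  process.env.LOG_LEVEL || (isProduction ? 'info' : isTest ? 'fatal' : 'debug')

console.log(defaultLevel) // 'debug' in dev, 'info' in prod, 'fatal' in tests
```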
libraries/metrics/.dockerignore (new file, 1 line):

```diff
@@ -0,0 +1 @@
+node_modules/
```

libraries/metrics/.gitignore (new file, vendored, 3 lines):

```diff
@@ -0,0 +1,3 @@
+node_modules
+
+.npmrc
```

```diff
@@ -1 +1 @@
-22.17.0
+20.18.0

@@ -1,10 +1,10 @@
 metrics
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0

@@ -5,8 +5,6 @@
 * before any other module to support code instrumentation.
 */

-const metricsModuleImportStartTime = performance.now()
-
 const APP_NAME = process.env.METRICS_APP_NAME || 'unknown'
 const BUILD_VERSION = process.env.BUILD_VERSION
 const ENABLE_PROFILE_AGENT = process.env.ENABLE_PROFILE_AGENT === 'true'

@@ -105,5 +103,3 @@ function recordProcessStart() {
 const metrics = require('.')
 metrics.inc('process_startup')
 }
-
-module.exports = { metricsModuleImportStartTime }

@@ -9,7 +9,7 @@
 "main": "index.js",
 "dependencies": {
 "@google-cloud/opentelemetry-cloud-trace-exporter": "^2.1.0",
-"@google-cloud/profiler": "^6.0.3",
+"@google-cloud/profiler": "^6.0.0",
 "@opentelemetry/api": "^1.4.1",
 "@opentelemetry/auto-instrumentations-node": "^0.39.1",
 "@opentelemetry/exporter-trace-otlp-http": "^0.41.2",

@@ -23,7 +23,7 @@
 "devDependencies": {
 "bunyan": "^1.0.0",
 "chai": "^4.3.6",
-"mocha": "^11.1.0",
+"mocha": "^10.2.0",
 "sandboxed-module": "^2.0.4",
 "sinon": "^9.2.4",
 "typescript": "^5.0.4"
```
libraries/mongo-utils/.dockerignore (new file, 1 line):

```diff
@@ -0,0 +1 @@
+node_modules/
```

libraries/mongo-utils/.gitignore (new file, vendored, 3 lines):

```diff
@@ -0,0 +1,3 @@
+
+# managed by monorepo$ bin/update_build_scripts
+.npmrc
```

```diff
@@ -1 +1 @@
-22.17.0
+20.18.0
```
```diff
@@ -16,7 +16,6 @@ let VERBOSE_LOGGING
 let BATCH_RANGE_START
 let BATCH_RANGE_END
 let BATCH_MAX_TIME_SPAN_IN_MS
-let BATCHED_UPDATE_RUNNING = false

 /**
 * @typedef {import("mongodb").Collection} Collection

@@ -35,7 +34,6 @@ let BATCHED_UPDATE_RUNNING = false
 * @property {string} [BATCH_RANGE_START]
 * @property {string} [BATCH_SIZE]
 * @property {string} [VERBOSE_LOGGING]
-* @property {(progress: string) => Promise<void>} [trackProgress]
 */

 /**

@@ -211,71 +209,59 @@ async function batchedUpdate(
 update,
 projection,
 findOptions,
-batchedUpdateOptions = {}
+batchedUpdateOptions
 ) {
-// only a single batchedUpdate can run at a time due to global variables
-if (BATCHED_UPDATE_RUNNING) {
-throw new Error('batchedUpdate is already running')
+ID_EDGE_PAST = await getIdEdgePast(collection)
+if (!ID_EDGE_PAST) {
+console.warn(
+`The collection ${collection.collectionName} appears to be empty.`
+)
+return 0
 }
-try {
-BATCHED_UPDATE_RUNNING = true
-ID_EDGE_PAST = await getIdEdgePast(collection)
-if (!ID_EDGE_PAST) {
-console.warn(
-`The collection ${collection.collectionName} appears to be empty.`
-)
-return 0
-}
+refreshGlobalOptionsForBatchedUpdate(batchedUpdateOptions)
-const { trackProgress = async progress => console.warn(progress) } =
-batchedUpdateOptions
-refreshGlobalOptionsForBatchedUpdate(batchedUpdateOptions)

-findOptions = findOptions || {}
-findOptions.readPreference = READ_PREFERENCE_SECONDARY
+findOptions = findOptions || {}
+findOptions.readPreference = READ_PREFERENCE_SECONDARY

-projection = projection || { _id: 1 }
-let nextBatch
-let updated = 0
-let start = BATCH_RANGE_START
+projection = projection || { _id: 1 }
+let nextBatch
+let updated = 0
+let start = BATCH_RANGE_START

-while (start !== BATCH_RANGE_END) {
-let end = getNextEnd(start)
-nextBatch = await getNextBatch(
-collection,
-query,
-start,
-end,
-projection,
-findOptions
-)
-if (nextBatch.length > 0) {
-end = nextBatch[nextBatch.length - 1]._id
-updated += nextBatch.length
+while (start !== BATCH_RANGE_END) {
+let end = getNextEnd(start)
+nextBatch = await getNextBatch(
+collection,
+query,
+start,
+end,
+projection,
+findOptions
+)
+if (nextBatch.length > 0) {
+end = nextBatch[nextBatch.length - 1]._id
+updated += nextBatch.length

-if (VERBOSE_LOGGING) {
-console.log(
-`Running update on batch with ids ${JSON.stringify(
-nextBatch.map(entry => entry._id)
-)}`
-)
-}
-await trackProgress(
-`Running update on batch ending ${renderObjectId(end)}`
+if (VERBOSE_LOGGING) {
+console.log(
+`Running update on batch with ids ${JSON.stringify(
+nextBatch.map(entry => entry._id)
+)}`
 )

-if (typeof update === 'function') {
-await update(nextBatch)
-} else {
-await performUpdate(collection, nextBatch, update)
-}
+} else {
+console.error(`Running update on batch ending ${renderObjectId(end)}`)
+}

+if (typeof update === 'function') {
+await update(nextBatch)
+} else {
+await performUpdate(collection, nextBatch, update)
+}
-await trackProgress(`Completed batch ending ${renderObjectId(end)}`)
-start = end
-}
-return updated
-} finally {
-BATCHED_UPDATE_RUNNING = false
+console.error(`Completed batch ending ${renderObjectId(end)}`)
+start = end
 }
+return updated
 }

 /**
```
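A hedged usage sketch of `batchedUpdate` as it appears on the ext-ce side of the hunk above. The collection, filter, and update are invented for illustration, and the require path is an assumption based on the monorepo layout:

```js
// Hypothetical caller; the parameter order matches the signature in the diff.
const { batchedUpdate } = require('@overleaf/mongo-utils/batchedUpdate')

async function archiveDeletedProjects(db) {
  const updated = await batchedUpdate(
    db.collection('projects'),        // a mongodb-legacy collection (assumed name)
    { deleted: true },                // query selecting candidate documents
    { $set: { archived: true } },     // update document, or an async batch callback
    null,                             // projection, defaults to { _id: 1 }
    null,                             // extra find options
    { trackProgress: async p => console.warn(p) } // ext-ce progress hook
  )
  console.warn(`updated ${updated} documents`)
}
```

Note the design difference the diff captures: the ext-ce side guards against concurrent runs (the function uses module-level state) and reports progress through an injectable `trackProgress`, whereas the v5.3.1 side logs directly with `console.error`.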
```diff
@@ -1,10 +1,10 @@
 mongo-utils
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0

@@ -16,12 +16,12 @@
 "author": "Overleaf (https://www.overleaf.com)",
 "license": "AGPL-3.0-only",
 "dependencies": {
-"mongodb": "6.12.0",
+"mongodb": "6.10.0",
 "mongodb-legacy": "6.1.3"
 },
 "devDependencies": {
 "chai": "^4.3.6",
-"mocha": "^11.1.0",
+"mocha": "^10.2.0",
 "sandboxed-module": "^2.0.4",
 "sinon": "^9.2.4",
 "sinon-chai": "^3.7.0",
```
libraries/o-error/.dockerignore (new file, 1 line):

```diff
@@ -0,0 +1 @@
+node_modules/
```

libraries/o-error/.gitignore (new file, vendored, 5 lines):

```diff
@@ -0,0 +1,5 @@
+.nyc_output
+coverage
+node_modules/
+
+.npmrc
```

```diff
@@ -1 +1 @@
-22.17.0
+20.18.0

@@ -1,10 +1,10 @@
 o-error
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
```
```diff
@@ -1,34 +1,20 @@
-// @ts-check
-
 /**
 * Light-weight helpers for handling JavaScript Errors in node.js and the
 * browser.
 */
 class OError extends Error {
-/**
-* The error that is the underlying cause of this error
-*
-* @type {unknown}
-*/
-cause
-
-/**
-* List of errors encountered as the callback chain is unwound
-*
-* @type {TaggedError[] | undefined}
-*/
-_oErrorTags
-
 /**
 * @param {string} message as for built-in Error
 * @param {Object} [info] extra data to attach to the error
-* @param {unknown} [cause] the internal error that caused this error
+* @param {Error} [cause] the internal error that caused this error
 */
 constructor(message, info, cause) {
 super(message)
 this.name = this.constructor.name
 if (info) this.info = info
 if (cause) this.cause = cause
+/** @private @type {Array<TaggedError> | undefined} */
+this._oErrorTags // eslint-disable-line
 }

 /**

@@ -45,7 +31,7 @@ class OError extends Error {
 /**
 * Wrap the given error, which caused this error.
 *
-* @param {unknown} cause the internal error that caused this error
+* @param {Error} cause the internal error that caused this error
 * @return {this}
 */
 withCause(cause) {

@@ -79,16 +65,13 @@ class OError extends Error {
 * }
 * }
 *
-* @template {unknown} E
-* @param {E} error the error to tag
+* @param {Error} error the error to tag
 * @param {string} [message] message with which to tag `error`
 * @param {Object} [info] extra data with wich to tag `error`
-* @return {E} the modified `error` argument
+* @return {Error} the modified `error` argument
 */
 static tag(error, message, info) {
-const oError = /** @type {{ _oErrorTags: TaggedError[] | undefined }} */ (
-error
-)
+const oError = /** @type{OError} */ (error)

 if (!oError._oErrorTags) oError._oErrorTags = []

@@ -119,7 +102,7 @@ class OError extends Error {
 *
 * If an info property is repeated, the last one wins.
 *
-* @param {unknown} error any error (may or may not be an `OError`)
+* @param {Error | null | undefined} error any error (may or may not be an `OError`)
 * @return {Object}
 */
 static getFullInfo(error) {

@@ -146,7 +129,7 @@ class OError extends Error {
 * Return the `stack` property from `error`, including the `stack`s for any
 * tagged errors added with `OError.tag` and for any `cause`s.
 *
-* @param {unknown} error any error (may or may not be an `OError`)
+* @param {Error | null | undefined} error any error (may or may not be an `OError`)
 * @return {string}
 */
 static getFullStack(error) {

@@ -160,7 +143,7 @@ class OError extends Error {
 stack += `\n${oError._oErrorTags.map(tag => tag.stack).join('\n')}`
 }

-const causeStack = OError.getFullStack(oError.cause)
+const causeStack = oError.cause && OError.getFullStack(oError.cause)
 if (causeStack) {
 stack += '\ncaused by:\n' + indent(causeStack)
 }
```
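The practical effect of the widened `unknown` typings on the ext-ce side is that non-Error throws can be carried as causes. A self-contained sketch, taken directly from the behaviour the removed test below exercises:

```js
const OError = require('@overleaf/o-error')

// On the ext-ce side `cause` is typed `unknown`, so a non-Error value
// (a string here) round-trips through both the constructor and withCause.
const err1 = new OError('foo', {}, 'not-an-error')
console.log(err1.cause) // 'not-an-error'

const err2 = new OError('foo').withCause('not-an-error')
console.log(err2.cause) // 'not-an-error'
```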
@ -34,7 +34,7 @@
|
|||
"@types/chai": "^4.3.0",
|
||||
"@types/node": "^18.17.4",
|
||||
"chai": "^4.3.6",
|
||||
"mocha": "^11.1.0",
|
||||
"mocha": "^10.2.0",
|
||||
"typescript": "^5.0.4"
|
||||
}
|
||||
}
|
||||
|
|
|
@ -268,11 +268,6 @@ describe('utils', function () {
|
|||
expect(OError.getFullInfo(null)).to.deep.equal({})
|
||||
})
|
||||
|
||||
it('works when given a string', function () {
|
||||
const err = 'not an error instance'
|
||||
expect(OError.getFullInfo(err)).to.deep.equal({})
|
||||
})
|
||||
|
||||
it('works on a normal error', function () {
|
||||
const err = new Error('foo')
|
||||
expect(OError.getFullInfo(err)).to.deep.equal({})
|
||||
|
|
|
@ -35,14 +35,6 @@ describe('OError', function () {
|
|||
expect(err2.cause.message).to.equal('cause 2')
|
||||
})
|
||||
|
||||
it('accepts non-Error causes', function () {
|
||||
const err1 = new OError('foo', {}, 'not-an-error')
|
||||
expect(err1.cause).to.equal('not-an-error')
|
||||
|
||||
const err2 = new OError('foo').withCause('not-an-error')
|
||||
expect(err2.cause).to.equal('not-an-error')
|
||||
})
|
||||
|
||||
it('handles a custom error type with a cause', function () {
|
||||
function doSomethingBadInternally() {
|
||||
throw new Error('internal error')
|
||||
|
|
libraries/object-persistor/.dockerignore (new file, 1 line):

```diff
@@ -0,0 +1 @@
+node_modules/
```

libraries/object-persistor/.gitignore (new file, vendored, 4 lines):

```diff
@@ -0,0 +1,4 @@
+/node_modules
+*.swp
+
+.npmrc
```

```diff
@@ -1 +1 @@
-22.17.0
+20.18.0

@@ -1,10 +1,10 @@
 object-persistor
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0

@@ -34,9 +34,9 @@
 "devDependencies": {
 "chai": "^4.3.6",
 "chai-as-promised": "^7.1.1",
-"mocha": "^11.1.0",
+"mocha": "^10.2.0",
 "mock-fs": "^5.2.0",
-"mongodb": "6.12.0",
+"mongodb": "6.10.0",
 "sandboxed-module": "^2.0.4",
 "sinon": "^9.2.4",
 "sinon-chai": "^3.7.0",
```
```diff
@@ -305,10 +305,8 @@ module.exports = class FSPersistor extends AbstractPersistor {

 async _listDirectory(path) {
 if (this.useSubdirectories) {
-// eslint-disable-next-line @typescript-eslint/return-await
 return await glob(Path.join(path, '**'))
 } else {
-// eslint-disable-next-line @typescript-eslint/return-await
 return await glob(`${path}_*`)
 }
 }
```
```diff
@@ -33,10 +33,6 @@ const AES256_KEY_LENGTH = 32
 * @property {() => Promise<Array<RootKeyEncryptionKey>>} getRootKeyEncryptionKeys
 */

-/**
-* @typedef {import('./types').ListDirectoryResult} ListDirectoryResult
-*/
-
 /**
 * Helper function to make TS happy when accessing error properties
 * AWSError is not an actual class, so we cannot use instanceof.

@@ -347,10 +343,9 @@ class PerProjectEncryptedS3Persistor extends S3Persistor {
 }

 async deleteDirectory(bucketName, path, continuationToken) {
-// Let [Settings.pathToProjectFolder] validate the project path before deleting things.
-const { projectFolder, dekPath } = this.#buildProjectPaths(bucketName, path)
 // Note: Listing/Deleting a prefix does not require SSE-C credentials.
 await super.deleteDirectory(bucketName, path, continuationToken)
+const { projectFolder, dekPath } = this.#buildProjectPaths(bucketName, path)
 if (projectFolder === path) {
 await super.deleteObject(
 this.#settings.dataEncryptionKeyBucketName,

@@ -395,9 +390,9 @@ class PerProjectEncryptedS3Persistor extends S3Persistor {
 * A general "cache" for project keys is another alternative. For now, use a helper class.
 */
 class CachedPerProjectEncryptedS3Persistor {
-/** @type SSECOptions */
+/** @type SSECOptions */
 #projectKeyOptions
-/** @type PerProjectEncryptedS3Persistor */
+/** @type PerProjectEncryptedS3Persistor */
 #parent

 /**

@@ -418,26 +413,6 @@ class CachedPerProjectEncryptedS3Persistor {
 return await this.sendStream(bucketName, path, fs.createReadStream(fsPath))
 }

-/**
-*
-* @param {string} bucketName
-* @param {string} path
-* @return {Promise<number>}
-*/
-async getObjectSize(bucketName, path) {
-return await this.#parent.getObjectSize(bucketName, path)
-}
-
-/**
-*
-* @param {string} bucketName
-* @param {string} path
-* @return {Promise<ListDirectoryResult>}
-*/
-async listDirectory(bucketName, path) {
-return await this.#parent.listDirectory(bucketName, path)
-}
-
 /**
 * @param {string} bucketName
 * @param {string} path
```
```diff
@@ -20,18 +20,6 @@ const { URL } = require('node:url')
 const { WriteError, ReadError, NotFoundError } = require('./Errors')
 const zlib = require('node:zlib')

-/**
-* @typedef {import('aws-sdk/clients/s3').ListObjectsV2Output} ListObjectsV2Output
-*/
-
-/**
-* @typedef {import('aws-sdk/clients/s3').Object} S3Object
-*/
-
-/**
-* @typedef {import('./types').ListDirectoryResult} ListDirectoryResult
-*/
-
 /**
 * Wrapper with private fields to avoid revealing them on console, JSON.stringify or similar.
 */

@@ -278,12 +266,26 @@ class S3Persistor extends AbstractPersistor {
 * @return {Promise<void>}
 */
 async deleteDirectory(bucketName, key, continuationToken) {
-const { contents, response } = await this.listDirectory(
-bucketName,
-key,
-continuationToken
-)
-const objects = contents.map(item => ({ Key: item.Key || '' }))
+let response
+const options = { Bucket: bucketName, Prefix: key }
+if (continuationToken) {
+options.ContinuationToken = continuationToken
+}
+
+try {
+response = await this._getClientForBucket(bucketName)
+.listObjectsV2(options)
+.promise()
+} catch (err) {
+throw PersistorHelper.wrapError(
+err,
+'failed to list objects in S3',
+{ bucketName, key },
+ReadError
+)
+}
+
+const objects = response.Contents?.map(item => ({ Key: item.Key || '' }))
 if (objects?.length) {
 try {
 await this._getClientForBucket(bucketName)

@@ -314,36 +316,6 @@ class S3Persistor extends AbstractPersistor {
 }
 }

-/**
-*
-* @param {string} bucketName
-* @param {string} key
-* @param {string} [continuationToken]
-* @return {Promise<ListDirectoryResult>}
-*/
-async listDirectory(bucketName, key, continuationToken) {
-let response
-const options = { Bucket: bucketName, Prefix: key }
-if (continuationToken) {
-options.ContinuationToken = continuationToken
-}
-
-try {
-response = await this._getClientForBucket(bucketName)
-.listObjectsV2(options)
-.promise()
-} catch (err) {
-throw PersistorHelper.wrapError(
-err,
-'failed to list objects in S3',
-{ bucketName, key },
-ReadError
-)
-}
-
-return { contents: response.Contents ?? [], response }
-}
-
 /**
 * @param {string} bucketName
 * @param {string} key
```
libraries/object-persistor/src/types.d.ts (deleted file, vendored, 6 lines):

```diff
@@ -1,6 +0,0 @@
-import type { ListObjectsV2Output, Object } from 'aws-sdk/clients/s3'
-
-export type ListDirectoryResult = {
-contents: Array<Object>
-response: ListObjectsV2Output
-}
```
libraries/overleaf-editor-core/.dockerignore (new file, 1 line):

```diff
@@ -0,0 +1 @@
+node_modules/
```

libraries/overleaf-editor-core/.gitignore (new file, vendored, 5 lines):

```diff
@@ -0,0 +1,5 @@
+/coverage
+/node_modules
+
+# managed by monorepo$ bin/update_build_scripts
+.npmrc
```

```diff
@@ -1 +1 @@
-22.17.0
+20.18.0

@@ -1,10 +1,10 @@
 overleaf-editor-core
 --dependencies=None
---docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
+--docker-repos=gcr.io/overleaf-ops
 --env-add=
 --env-pass-through=
 --esmock-loader=False
 --is-library=True
---node-version=22.17.0
+--node-version=20.18.0
 --public-repo=False
---script-version=4.7.0
+--script-version=4.5.0
```
```diff
@@ -18,7 +18,6 @@ const MoveFileOperation = require('./lib/operation/move_file_operation')
 const SetCommentStateOperation = require('./lib/operation/set_comment_state_operation')
 const EditFileOperation = require('./lib/operation/edit_file_operation')
-const EditNoOperation = require('./lib/operation/edit_no_operation')
 const EditOperationTransformer = require('./lib/operation/edit_operation_transformer')
 const SetFileMetadataOperation = require('./lib/operation/set_file_metadata_operation')
 const NoOperation = require('./lib/operation/no_operation')
 const Operation = require('./lib/operation')

@@ -44,8 +43,6 @@ const TrackingProps = require('./lib/file_data/tracking_props')
 const Range = require('./lib/range')
 const CommentList = require('./lib/file_data/comment_list')
-const LazyStringFileData = require('./lib/file_data/lazy_string_file_data')
 const StringFileData = require('./lib/file_data/string_file_data')
 const EditOperationBuilder = require('./lib/operation/edit_operation_builder')

 exports.AddCommentOperation = AddCommentOperation
 exports.Author = Author

@@ -61,7 +58,6 @@ exports.DeleteCommentOperation = DeleteCommentOperation
 exports.File = File
 exports.FileMap = FileMap
-exports.LazyStringFileData = LazyStringFileData
 exports.StringFileData = StringFileData
 exports.History = History
 exports.Label = Label
 exports.AddFileOperation = AddFileOperation

@@ -69,8 +65,6 @@ exports.MoveFileOperation = MoveFileOperation
 exports.SetCommentStateOperation = SetCommentStateOperation
 exports.EditFileOperation = EditFileOperation
-exports.EditNoOperation = EditNoOperation
 exports.EditOperationBuilder = EditOperationBuilder
 exports.EditOperationTransformer = EditOperationTransformer
 exports.SetFileMetadataOperation = SetFileMetadataOperation
 exports.NoOperation = NoOperation
 exports.Operation = Operation
```
```diff
@@ -13,7 +13,7 @@ const V2DocVersions = require('./v2_doc_versions')

 /**
 * @import Author from "./author"
-* @import { BlobStore, RawChange, ReadonlyBlobStore } from "./types"
+* @import { BlobStore } from "./types"
 */

 /**

@@ -54,7 +54,7 @@ class Change {
 /**
 * For serialization.
 *
-* @return {RawChange}
+* @return {Object}
 */
 toRaw() {
 function toRaw(object) {

@@ -100,9 +100,6 @@ class Change {
 )
 }

-/**
-* @return {Operation[]}
-*/
 getOperations() {
 return this.operations
 }

@@ -219,7 +216,7 @@ class Change {
 * If this Change contains any File objects, load them.
 *
 * @param {string} kind see {File#load}
-* @param {ReadonlyBlobStore} blobStore
+* @param {BlobStore} blobStore
 * @return {Promise<void>}
 */
 async loadFiles(kind, blobStore) {

@@ -251,24 +248,6 @@ class Change {
 * @param {boolean} [opts.strict] - Do not ignore recoverable errors
 */
 applyTo(snapshot, opts = {}) {
-// eslint-disable-next-line no-unused-vars
-for (const operation of this.iterativelyApplyTo(snapshot, opts)) {
-// Nothing to do: we're just consuming the iterator for the side effects
-}
-}
-
-/**
-* Generator that applies this change to a snapshot and yields each
-* operation after it has been applied.
-*
-* Recoverable errors (caused by historical bad data) are ignored unless
-* opts.strict is true
-*
-* @param {Snapshot} snapshot modified in place
-* @param {object} opts
-* @param {boolean} [opts.strict] - Do not ignore recoverable errors
-*/
-*iterativelyApplyTo(snapshot, opts = {}) {
 assert.object(snapshot, 'bad snapshot')

 for (const operation of this.operations) {

@@ -282,7 +261,6 @@ class Change {
 throw err
 }
 }
-yield operation
 }

 // update project version if present in change
```
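The `applyTo`/`iterativelyApplyTo` split the ext-ce side removes is a small generator pattern. A self-contained sketch of the idea, with trivial functions standing in for real operations:

```js
// Generator applies each op in turn and yields it, so callers can observe
// progress between operations; applyTo simply drains the generator.
function* iterativelyApply(ops, state) {
  for (const op of ops) {
    state.value = op(state.value)
    yield op
  }
}

function applyAll(ops, state) {
  // eslint-disable-next-line no-unused-vars
  for (const op of iterativelyApply(ops, state)) {
    // Nothing to do: consuming the iterator for its side effects only.
  }
}

const state = { value: 1 }
applyAll([x => x + 1, x => x * 10], state)
console.log(state.value) // 20
```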
```diff
@@ -1,7 +1,7 @@
 // @ts-check

 /**
-* @import { ClearTrackingPropsRawData, TrackingDirective } from '../types'
+* @import { ClearTrackingPropsRawData } from '../types'
 */

 class ClearTrackingProps {

@@ -11,27 +11,12 @@ class ClearTrackingProps {

 /**
 * @param {any} other
-* @returns {other is ClearTrackingProps}
+* @returns {boolean}
 */
 equals(other) {
 return other instanceof ClearTrackingProps
 }

-/**
-* @param {TrackingDirective} other
-* @returns {other is ClearTrackingProps}
-*/
-canMergeWith(other) {
-return other instanceof ClearTrackingProps
-}
-
-/**
-* @param {TrackingDirective} other
-*/
-mergeWith(other) {
-return this
-}
-
 /**
 * @returns {ClearTrackingPropsRawData}
 */
```
```diff
@@ -11,7 +11,7 @@ const EditOperation = require('../operation/edit_operation')
 const EditOperationBuilder = require('../operation/edit_operation_builder')

 /**
-* @import { BlobStore, ReadonlyBlobStore, RangesBlob, RawHashFileData, RawLazyStringFileData } from '../types'
+* @import { BlobStore, ReadonlyBlobStore, RangesBlob, RawFileData, RawLazyStringFileData } from '../types'
 */

 class LazyStringFileData extends FileData {

@@ -159,11 +159,11 @@ class LazyStringFileData extends FileData {

 /** @inheritdoc
 * @param {BlobStore} blobStore
-* @return {Promise<RawHashFileData>}
+* @return {Promise<RawFileData>}
 */
 async store(blobStore) {
 if (this.operations.length === 0) {
-/** @type RawHashFileData */
+/** @type RawFileData */
 const raw = { hash: this.hash }
 if (this.rangesHash) {
 raw.rangesHash = this.rangesHash

@@ -171,11 +171,9 @@ class LazyStringFileData extends FileData {
 return raw
 }
 const eager = await this.toEager(blobStore)
-const raw = await eager.store(blobStore)
-this.hash = raw.hash
-this.rangesHash = raw.rangesHash
-this.operations.length = 0
-return raw
+/** @type RawFileData */
+return await eager.store(blobStore)
 }
 }
```
```diff
@@ -8,7 +8,7 @@ const CommentList = require('./comment_list')
 const TrackedChangeList = require('./tracked_change_list')

 /**
-* @import { StringFileRawData, RawHashFileData, BlobStore, CommentRawData } from "../types"
+* @import { StringFileRawData, RawFileData, BlobStore, CommentRawData } from "../types"
 * @import { TrackedChangeRawData, RangesBlob } from "../types"
 * @import EditOperation from "../operation/edit_operation"
 */

@@ -88,14 +88,6 @@ class StringFileData extends FileData {
 return content
 }

-/**
-* Return docstore view of a doc: each line separated
-* @return {string[]}
-*/
-getLines() {
-return this.getContent({ filterTrackedDeletes: true }).split('\n')
-}
-
 /** @inheritdoc */
 getByteLength() {
 return Buffer.byteLength(this.content)

@@ -139,7 +131,7 @@ class StringFileData extends FileData {
 /**
 * @inheritdoc
 * @param {BlobStore} blobStore
-* @return {Promise<RawHashFileData>}
+* @return {Promise<RawFileData>}
 */
 async store(blobStore) {
 const blob = await blobStore.putString(this.content)
```
```diff
@@ -84,21 +84,6 @@ class TrackedChange {
 )
 )
 }
-
-/**
-* Return an equivalent tracked change whose extent is limited to the given
-* range
-*
-* @param {Range} range
-* @returns {TrackedChange | null} - the result or null if the intersection is empty
-*/
-intersectRange(range) {
-const intersection = this.range.intersect(range)
-if (intersection == null) {
-return null
-}
-return new TrackedChange(intersection, this.tracking)
-}
 }

 module.exports = TrackedChange
```
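A self-contained sketch of the intersection logic behind the removed `intersectRange`, with plain `{ start, length }` objects standing in for the `Range` and `TrackedChange` classes:

```js
// Intersection of two half-open ranges; null when they share no character.
function intersect(a, b) {
  const start = Math.max(a.start, b.start)
  const end = Math.min(a.start + a.length, b.start + b.length)
  if (start >= end) return null // empty intersection
  return { start, length: end - start }
}

console.log(intersect({ start: 0, length: 5 }, { start: 3, length: 6 }))
// { start: 3, length: 2 }
console.log(intersect({ start: 0, length: 2 }, { start: 4, length: 1 }))
// null
```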
```diff
@@ -2,11 +2,9 @@
 const Range = require('../range')
 const TrackedChange = require('./tracked_change')
 const TrackingProps = require('../file_data/tracking_props')
-const { InsertOp, RemoveOp, RetainOp } = require('../operation/scan_op')

 /**
 * @import { TrackingDirective, TrackedChangeRawData } from "../types"
-* @import TextOperation from "../operation/text_operation"
 */

 class TrackedChangeList {

@@ -60,22 +58,6 @@ class TrackedChangeList {
 return this._trackedChanges.filter(change => range.contains(change.range))
 }

-/**
-* Returns tracked changes that overlap with the given range
-* @param {Range} range
-* @returns {TrackedChange[]}
-*/
-intersectRange(range) {
-const changes = []
-for (const change of this._trackedChanges) {
-const intersection = change.intersectRange(range)
-if (intersection != null) {
-changes.push(intersection)
-}
-}
-return changes
-}
-
 /**
 * Returns the tracking props for a given range.
 * @param {Range} range

@@ -107,8 +89,6 @@ class TrackedChangeList {

 /**
 * Collapses consecutive (and compatible) ranges
-*
-* @private
 * @returns {void}
 */
 _mergeRanges() {

@@ -137,28 +117,12 @@ class TrackedChangeList {
 }

-/**
-* Apply an insert operation
-*
-* @param {number} cursor
-* @param {string} insertedText
-* @param {{tracking?: TrackingProps}} opts
-*/
-applyInsert(cursor, insertedText, opts = {}) {
-this._applyInsert(cursor, insertedText, opts)
-this._mergeRanges()
-}
-
 /**
 * Apply an insert operation
 *
-* This method will not merge ranges at the end
-*
-* @private
 * @param {number} cursor
 * @param {string} insertedText
 * @param {{tracking?: TrackingProps}} [opts]
 */
-_applyInsert(cursor, insertedText, opts = {}) {
+applyInsert(cursor, insertedText, opts = {}) {
 const newTrackedChanges = []
 for (const trackedChange of this._trackedChanges) {
 if (

@@ -207,29 +171,15 @@ class TrackedChangeList {
 newTrackedChanges.push(newTrackedChange)
 }
 this._trackedChanges = newTrackedChanges
+this._mergeRanges()
 }

 /**
 * Apply a delete operation to the list of tracked changes
 *
 * @param {number} cursor
 * @param {number} length
 */
 applyDelete(cursor, length) {
-this._applyDelete(cursor, length)
-this._mergeRanges()
-}
-
-/**
-* Apply a delete operation to the list of tracked changes
-*
-* This method will not merge ranges at the end
-*
-* @private
-* @param {number} cursor
-* @param {number} length
-*/
-_applyDelete(cursor, length) {
 const newTrackedChanges = []
 for (const trackedChange of this._trackedChanges) {
 const deletedRange = new Range(cursor, length)

@@ -255,31 +205,15 @@ class TrackedChangeList {
 }
 }
 this._trackedChanges = newTrackedChanges
 }

-/**
-* Apply a retain operation to the list of tracked changes
-*
-* @param {number} cursor
-* @param {number} length
-* @param {{tracking?: TrackingDirective}} [opts]
-*/
-applyRetain(cursor, length, opts = {}) {
-this._applyRetain(cursor, length, opts)
-this._mergeRanges()
-}
-
 /**
 * Apply a retain operation to the list of tracked changes
 *
-* This method will not merge ranges at the end
-*
-* @private
 * @param {number} cursor
 * @param {number} length
 * @param {{tracking?: TrackingDirective}} opts
 */
-_applyRetain(cursor, length, opts = {}) {
+applyRetain(cursor, length, opts = {}) {
 // If there's no tracking info, leave everything as-is
 if (!opts.tracking) {
 return

@@ -335,31 +269,6 @@ class TrackedChangeList {
 newTrackedChanges.push(newTrackedChange)
 }
 this._trackedChanges = newTrackedChanges
 }

-/**
-* Apply a text operation to the list of tracked changes
-*
-* Ranges are merged only once at the end, for performance and to avoid
-* problematic edge cases where intermediate ranges get incorrectly merged.
-*
-* @param {TextOperation} operation
-*/
-applyTextOperation(operation) {
-// this cursor tracks the destination document that gets modified as
-// operations are applied to it.
-let cursor = 0
-for (const op of operation.ops) {
-if (op instanceof InsertOp) {
-this._applyInsert(cursor, op.insertion, { tracking: op.tracking })
-cursor += op.insertion.length
-} else if (op instanceof RemoveOp) {
-this._applyDelete(cursor, op.length)
-} else if (op instanceof RetainOp) {
-this._applyRetain(cursor, op.length, { tracking: op.tracking })
-cursor += op.length
-}
-}
-this._mergeRanges()
-}
 }
```
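The ext-ce side's `applyTextOperation` defers range merging until the whole operation has been applied, precisely to avoid "problematic edge cases where intermediate ranges get incorrectly merged". A self-contained sketch of the single merge pass, with plain `{ start, length, userId }` objects standing in for tracked changes:

```js
// One linear pass: adjacent ranges with compatible tracking info are
// extended in place instead of being kept as separate entries.
function mergeRanges(ranges) {
  const merged = []
  for (const range of ranges) {
    const last = merged[merged.length - 1]
    if (
      last &&
      last.start + last.length === range.start && // strictly adjacent
      last.userId === range.userId // compatible tracking (simplified)
    ) {
      last.length += range.length
    } else {
      merged.push({ ...range })
    }
  }
  return merged
}

console.log(
  mergeRanges([
    { start: 0, length: 3, userId: 'u1' },
    { start: 3, length: 2, userId: 'u1' },
    { start: 5, length: 1, userId: 'u2' },
  ])
)
// [ { start: 0, length: 5, userId: 'u1' }, { start: 5, length: 1, userId: 'u2' } ]
```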
```diff
@@ -62,35 +62,6 @@ class TrackingProps {
 this.ts.getTime() === other.ts.getTime()
 )
 }
-
-/**
-* Are these tracking props compatible with the other tracking props for merging
-* ranges?
-*
-* @param {TrackingDirective} other
-* @returns {other is TrackingProps}
-*/
-canMergeWith(other) {
-if (!(other instanceof TrackingProps)) {
-return false
-}
-return this.type === other.type && this.userId === other.userId
-}
-
-/**
-* Merge two tracking props
-*
-* Assumes that `canMerge(other)` returns true
-*
-* @param {TrackingDirective} other
-*/
-mergeWith(other) {
-if (!this.canMergeWith(other)) {
-throw new Error('Cannot merge with incompatible tracking props')
-}
-const ts = this.ts <= other.ts ? this.ts : other.ts
-return new TrackingProps(this.type, this.userId, ts)
-}
 }

 module.exports = TrackingProps
```
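A self-contained sketch of the merge rule the removed methods implement: two tracking props are mergeable only when `type` and `userId` match, and the merged range keeps the earlier timestamp. Plain objects stand in for the `TrackingProps` class:

```js
function canMergeWith(a, b) {
  return a.type === b.type && a.userId === b.userId
}

function mergeWith(a, b) {
  if (!canMergeWith(a, b)) {
    throw new Error('Cannot merge with incompatible tracking props')
  }
  // Keep the earlier timestamp for the merged range.
  const ts = a.ts <= b.ts ? a.ts : b.ts
  return { type: a.type, userId: a.userId, ts }
}

const a = { type: 'insert', userId: 'u1', ts: new Date('2024-01-01') }
const b = { type: 'insert', userId: 'u1', ts: new Date('2024-02-01') }
console.log(mergeWith(a, b).ts.toISOString()) // 2024-01-01T00:00:00.000Z
```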
```diff
@@ -22,7 +22,7 @@ class NonUniquePathnameError extends PathnameError {
 * @param {string[]} pathnames
 */
 constructor(pathnames) {
-super('pathnames are not unique', { pathnames })
+super('pathnames are not unique: ' + pathnames, { pathnames })
 this.pathnames = pathnames
 }
 }

@@ -30,13 +30,9 @@ class NonUniquePathnameError extends PathnameError {
 class BadPathnameError extends PathnameError {
 /**
 * @param {string} pathname
-* @param {string} reason
 */
-constructor(pathname, reason) {
-if (pathname.length > 10) {
-pathname = pathname.slice(0, 5) + '...' + pathname.slice(-5)
-}
-super('invalid pathname', { reason, pathname })
+constructor(pathname) {
+super(pathname + ' is not a valid pathname', { pathname })
 this.pathname = pathname
 }
 }

@@ -46,7 +42,7 @@ class PathnameConflictError extends PathnameError {
 * @param {string} pathname
 */
 constructor(pathname) {
-super('pathname conflicts with another file', { pathname })
+super(`pathname '${pathname}' conflicts with another file`, { pathname })
 this.pathname = pathname
 }
 }

@@ -56,7 +52,7 @@ class FileNotFoundError extends PathnameError {
 * @param {string} pathname
 */
 constructor(pathname) {
-super('file does not exist', { pathname })
+super(`file ${pathname} does not exist`, { pathname })
 this.pathname = pathname
 }
 }

@@ -319,9 +315,8 @@ function checkPathnamesAreUnique(files) {
 */
 function checkPathname(pathname) {
 assert.nonEmptyString(pathname, 'bad pathname')
-const [isClean, reason] = safePathname.isCleanDebug(pathname)
-if (isClean) return
-throw new FileMap.BadPathnameError(pathname, reason)
+if (safePathname.isClean(pathname)) return
+throw new FileMap.BadPathnameError(pathname)
 }

 /**
```
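Note the direction of the message changes: the ext-ce side keeps pathnames out of error message strings and elides long values, while the v5.3.1 side interpolates them. A self-contained sketch of the elision the ext-ce `BadPathnameError` constructor performs:

```js
// Long (possibly user-controlled) pathnames are shortened before being
// attached to the error's info object.
function elide(pathname) {
  if (pathname.length > 10) {
    return pathname.slice(0, 5) + '...' + pathname.slice(-5)
  }
  return pathname
}

console.log(elide('short.tex')) // short.tex
console.log(elide('a-very-long-directory/name.tex')) // a-ver...e.tex
```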
```diff
@@ -7,7 +7,7 @@ const Change = require('./change')
 const Snapshot = require('./snapshot')

 /**
-* @import { BlobStore, ReadonlyBlobStore } from "./types"
+* @import { BlobStore } from "./types"
 */

 class History {

@@ -85,7 +85,7 @@ class History {
 * If this History contains any File objects, load them.
 *
 * @param {string} kind see {File#load}
-* @param {ReadonlyBlobStore} blobStore
+* @param {BlobStore} blobStore
 * @return {Promise<void>}
 */
 async loadFiles(kind, blobStore) {
```
```diff
@@ -36,20 +36,6 @@ class EditOperationBuilder {
 }
 throw new Error('Unsupported operation in EditOperationBuilder.fromJSON')
 }
-
-/**
-* @param {unknown} raw
-* @return {raw is RawEditOperation}
-*/
-static isValid(raw) {
-return (
-isTextOperation(raw) ||
-isRawAddCommentOperation(raw) ||
-isRawDeleteCommentOperation(raw) ||
-isRawSetCommentStateOperation(raw) ||
-isRawEditNoOperation(raw)
-)
-}
 }

 /**
```
```diff
@@ -13,7 +13,7 @@ let EditFileOperation = null
 let SetFileMetadataOperation = null

 /**
-* @import { ReadonlyBlobStore } from "../types"
+* @import { BlobStore } from "../types"
 * @import Snapshot from "../snapshot"
 */

@@ -80,7 +80,7 @@ class Operation {
 * If this operation references any files, load the files.
 *
 * @param {string} kind see {File#load}
-* @param {ReadOnlyBlobStore} blobStore
+* @param {BlobStore} blobStore
 * @return {Promise<void>}
 */
 async loadFiles(kind, blobStore) {}
```
```diff
@@ -175,7 +175,7 @@ class InsertOp extends ScanOp {
 return false
 }
 if (this.tracking) {
-if (!other.tracking || !this.tracking.canMergeWith(other.tracking)) {
+if (!this.tracking.equals(other.tracking)) {
 return false
 }
 } else if (other.tracking) {

@@ -198,10 +198,7 @@ class InsertOp extends ScanOp {
 throw new Error('Cannot merge with incompatible operation')
 }
 this.insertion += other.insertion
-if (this.tracking != null && other.tracking != null) {
-this.tracking = this.tracking.mergeWith(other.tracking)
-}
-// We already have the same commentIds
+// We already have the same tracking info and commentIds
 }

 /**

@@ -309,13 +306,9 @@ class RetainOp extends ScanOp {
 return false
 }
 if (this.tracking) {
-if (!other.tracking || !this.tracking.canMergeWith(other.tracking)) {
-return false
-}
-} else if (other.tracking) {
-return false
+return this.tracking.equals(other.tracking)
 }
-return true
+return !other.tracking
 }

 /**

@@ -326,9 +319,6 @@ class RetainOp extends ScanOp {
 throw new Error('Cannot merge with incompatible operation')
 }
 this.length += other.length
-if (this.tracking != null && other.tracking != null) {
-this.tracking = this.tracking.mergeWith(other.tracking)
-}
 }

 /**
```
```diff
@@ -56,34 +56,18 @@ class TextOperation extends EditOperation {

 constructor() {
 super()

-/**
-* When an operation is applied to an input string, you can think of this as
-* if an imaginary cursor runs over the entire string and skips over some
-* parts, removes some parts and inserts characters at some positions. These
-* actions (skip/remove/insert) are stored as an array in the "ops" property.
-* @type {ScanOp[]}
-*/
+// When an operation is applied to an input string, you can think of this as
+// if an imaginary cursor runs over the entire string and skips over some
+// parts, removes some parts and inserts characters at some positions. These
+// actions (skip/remove/insert) are stored as an array in the "ops" property.
+/** @type {ScanOp[]} */
 this.ops = []

-/**
-* An operation's baseLength is the length of every string the operation
-* can be applied to.
-*/
+// An operation's baseLength is the length of every string the operation
+// can be applied to.
 this.baseLength = 0

-/**
-* The targetLength is the length of every string that results from applying
-* the operation on a valid input string.
-*/
+// The targetLength is the length of every string that results from applying
+// the operation on a valid input string.
 this.targetLength = 0
-
-/**
-* The expected content hash after this operation is applied
-*
-* @type {string | null}
-*/
-this.contentHash = null
 }

 /**

@@ -239,12 +223,7 @@ class TextOperation extends EditOperation {
 * @returns {RawTextOperation}
 */
 toJSON() {
-/** @type {RawTextOperation} */
-const json = { textOperation: this.ops.map(op => op.toJSON()) }
-if (this.contentHash != null) {
-json.contentHash = this.contentHash
-}
-return json
+return { textOperation: this.ops.map(op => op.toJSON()) }
 }

 /**

@@ -252,7 +231,7 @@ class TextOperation extends EditOperation {
 * @param {RawTextOperation} obj
 * @returns {TextOperation}
 */
-static fromJSON = function ({ textOperation: ops, contentHash }) {
+static fromJSON = function ({ textOperation: ops }) {
 const o = new TextOperation()
 for (const op of ops) {
 if (isRetain(op)) {

@@ -271,9 +250,6 @@ class TextOperation extends EditOperation {
 throw new UnprocessableError('unknown operation: ' + JSON.stringify(op))
 }
 }
-if (contentHash != null) {
-o.contentHash = contentHash
-}
 return o
 }

@@ -314,18 +290,25 @@ class TextOperation extends EditOperation {
 str
 )
 }
+file.trackedChanges.applyRetain(result.length, op.length, {
+tracking: op.tracking,
+})
 result += str.slice(inputCursor, inputCursor + op.length)
 inputCursor += op.length
 } else if (op instanceof InsertOp) {
 if (containsNonBmpChars(op.insertion)) {
 throw new InvalidInsertionError(str, op.toJSON())
 }
+file.trackedChanges.applyInsert(result.length, op.insertion, {
+tracking: op.tracking,
+})
 file.comments.applyInsert(
 new Range(result.length, op.insertion.length),
 { commentIds: op.commentIds }
 )
 result += op.insertion
 } else if (op instanceof RemoveOp) {
+file.trackedChanges.applyDelete(result.length, op.length)
 file.comments.applyDelete(new Range(result.length, op.length))
 inputCursor += op.length
 } else {

@@ -345,8 +328,6 @@ class TextOperation extends EditOperation {
 throw new TextOperation.TooLongError(operation, result.length)
 }

-file.trackedChanges.applyTextOperation(this)
-
 file.content = result
 }

@@ -395,36 +376,44 @@ class TextOperation extends EditOperation {
 for (let i = 0, l = ops.length; i < l; i++) {
 const op = ops[i]
 if (op instanceof RetainOp) {
-if (op.tracking) {
-// Where we need to end up after the retains
-const target = strIndex + op.length
-// A previous retain could have overriden some tracking info. Now we
-// need to restore it.
-const previousChanges = previousState.trackedChanges.intersectRange(
-new Range(strIndex, op.length)
-)
+// Where we need to end up after the retains
+const target = strIndex + op.length
+// A previous retain could have overriden some tracking info. Now we
+// need to restore it.
+const previousRanges = previousState.trackedChanges.inRange(
+new Range(strIndex, op.length)
+)

-for (const change of previousChanges) {
-if (strIndex < change.range.start) {
-inverse.retain(change.range.start - strIndex, {
-tracking: new ClearTrackingProps(),
-})
-strIndex = change.range.start
-}
-inverse.retain(change.range.length, {
-tracking: change.tracking,
+let removeTrackingInfoIfNeeded
+if (op.tracking) {
+removeTrackingInfoIfNeeded = new ClearTrackingProps()
+}

+for (const trackedChange of previousRanges) {
+if (strIndex < trackedChange.range.start) {
+inverse.retain(trackedChange.range.start - strIndex, {
+tracking: removeTrackingInfoIfNeeded,
 })
-strIndex += change.range.length
+strIndex = trackedChange.range.start
 }
-if (strIndex < target) {
-inverse.retain(target - strIndex, {
-tracking: new ClearTrackingProps(),
+if (trackedChange.range.end < strIndex + op.length) {
+inverse.retain(trackedChange.range.length, {
+tracking: trackedChange.tracking,
 })
-strIndex = target
+strIndex = trackedChange.range.end
 }
-} else {
-inverse.retain(op.length)
-strIndex += op.length
+if (trackedChange.range.end !== strIndex) {
+// No need to split the range at the end
+const [left] = trackedChange.range.splitAt(strIndex)
+inverse.retain(left.length, { tracking: trackedChange.tracking })
+strIndex = left.end
+}
+}
+if (strIndex < target) {
+inverse.retain(target - strIndex, {
+tracking: removeTrackingInfoIfNeeded,
+})
+strIndex = target
+}
 } else if (op instanceof InsertOp) {
 inverse.remove(op.insertion.length)
```
|
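Note: both sides of this hunk build the inverse of a tracked retain; they differ in how precisely previously tracked ranges are restored (intersectRange with an unconditional ClearTrackingProps versus inRange with splitAt and a conditional clear). The contract itself is stated by the removed test "undoing a tracked delete restores the tracked changes": applying an operation and then its inverse must restore the previous tracked-change ranges. A rough sketch of that contract; the invert signature is inferred from the use of previousState.trackedChanges above:

```js
// Sketch of the invert contract, not of either implementation.
const initial = new StringFileData('the quick brown fox jumps over the lazy dog')
const op = new TextOperation()
  .retain(7)
  .retain(13, { tracking: new TrackingProps('delete', 'user1', new Date()) })
  .retain(23)

const inverse = op.invert(initial) // signature inferred from previousState usage
// edit(op) followed by edit(inverse) should reproduce initial exactly,
// including the tracked ranges that op's tracking data overwrote.
```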
@@ -86,32 +86,10 @@ class Range {
}

/**
* Does this range overlap another range?
*
* Overlapping means that the two ranges have at least one character in common
*
* @param {Range} other - the other range
* @param {Range} range
*/
overlaps(other) {
return this.start < other.end && this.end > other.start
}

/**
* Does this range overlap the start of another range?
*
* @param {Range} other - the other range
*/
overlapsStart(other) {
return this.start <= other.start && this.end > other.start
}

/**
* Does this range overlap the end of another range?
*
* @param {Range} other - the other range
*/
overlapsEnd(other) {
return this.start < other.end && this.end >= other.end
overlaps(range) {
return this.start < range.end && this.end > range.start
}

/**

@@ -249,26 +227,6 @@ class Range {
)
return [rangeUpToCursor, rangeAfterCursor]
}

/**
* Returns the intersection of this range with another range
*
* @param {Range} other - the other range
* @return {Range | null} the intersection or null if the intersection is empty
*/
intersect(other) {
if (this.contains(other)) {
return other
} else if (other.contains(this)) {
return this
} else if (other.overlapsStart(this)) {
return new Range(this.pos, other.end - this.start)
} else if (other.overlapsEnd(this)) {
return new Range(other.pos, this.end - other.start)
} else {
return null
}
}
}

module.exports = Range
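Note: ext-ce splits overlap testing into overlaps/overlapsStart/overlapsEnd and adds intersect(); v5.3.1-ext keeps a single overlaps(). The values below restate, outside the test harness, exactly what the removed intersect tests assert:

```js
// Grounded in the Range tests removed later in this diff.
const a = new Range(5, 10) // covers [5, 15)
const b = new Range(3, 6)  // covers [3, 9)

a.overlaps(b)      // true: at least one character in common
b.overlapsStart(a) // true: b covers a's start
a.overlapsEnd(b)   // true: a covers b's end

a.intersect(b)                 // Range { pos: 5, length: 4 }, the overlap [5, 9)
new Range(7, 2).intersect(a)   // Range { pos: 7, length: 2 }, nested range
new Range(20, 30).intersect(a) // null, disconnected ranges
```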
@@ -64,57 +64,17 @@ function cleanPart(filename) {
* @return {String}
*/
exports.clean = function (pathname) {
return exports.cleanDebug(pathname)[0]
}

/**
* See clean
* @param {string} pathname
* @return {[string,string]}
*/
exports.cleanDebug = function (pathname) {
let prev = pathname
let reason = ''

/**
* @param {string} label
*/
function recordReasonIfChanged(label) {
if (pathname === prev) return
if (reason) reason += ','
reason += label
prev = pathname
}
pathname = path.normalize(pathname)
recordReasonIfChanged('normalize')

pathname = pathname.replace(/\\/g, '/')
recordReasonIfChanged('workaround for IE')

pathname = pathname.replace(/\/+/g, '/')
recordReasonIfChanged('no multiple slashes')

pathname = pathname.replace(/^(\/.*)$/, '_$1')
recordReasonIfChanged('no leading /')

pathname = pathname.replace(/^(.+)\/$/, '$1')
recordReasonIfChanged('no trailing /')

pathname = pathname.replace(/^ *(.*)$/, '$1')
recordReasonIfChanged('no leading spaces')

pathname = pathname.replace(/^(.*[^ ]) *$/, '$1')
recordReasonIfChanged('no trailing spaces')

pathname = pathname.replace(/\\/g, '/') // workaround for IE
pathname = pathname.replace(/\/+/g, '/') // no multiple slashes
pathname = pathname.replace(/^(\/.*)$/, '_$1') // no leading /
pathname = pathname.replace(/^(.+)\/$/, '$1') // no trailing /
pathname = pathname.replace(/^ *(.*)$/, '$1') // no leading spaces
pathname = pathname.replace(/^(.*[^ ]) *$/, '$1') // no trailing spaces
if (pathname.length === 0) pathname = '_'
recordReasonIfChanged('empty')

pathname = pathname.split('/').map(cleanPart).join('/')
recordReasonIfChanged('cleanPart')

pathname = pathname.replace(BLOCKED_FILE_RX, '@$1')
recordReasonIfChanged('BLOCKED_FILE_RX')
return [pathname, reason]
return pathname
}

/**

@@ -124,19 +84,9 @@ exports.cleanDebug = function (pathname) {
* @return {Boolean}
*/
exports.isClean = function pathnameIsClean(pathname) {
return exports.isCleanDebug(pathname)[0]
}

/**
* A pathname is clean (see clean) and not too long.
*
* @param {string} pathname
* @return {[boolean,string]}
*/
exports.isCleanDebug = function (pathname) {
if (pathname.length > MAX_PATH) return [false, 'MAX_PATH']
if (pathname.length === 0) return [false, 'empty']
const [cleanPathname, reason] = exports.cleanDebug(pathname)
if (cleanPathname !== pathname) return [false, reason]
return [true, '']
return (
exports.clean(pathname) === pathname &&
pathname.length <= MAX_PATH &&
pathname.length > 0
)
}
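Note: ext-ce refactors clean() into cleanDebug(), which additionally reports which cleaning rule fired; v5.3.1-ext applies the same replacements inline and returns only the string. The pairs below are taken directly from the ext-ce test expectations further down:

```js
// Grounded in the safePathname tests later in this diff (ext-ce side).
safePathname.cleanDebug('/foo')         // ['_/foo', 'no leading /']
safePathname.cleanDebug('../foo')       // ['__/foo', 'cleanPart']
safePathname.cleanDebug('foo//bar.png') // ['foo/bar.png', 'normalize']
safePathname.cleanDebug('prototype')    // ['@prototype', 'BLOCKED_FILE_RX']

safePathname.clean('/foo')   // '_/foo' (just cleanDebug(pathname)[0])
safePathname.isClean('/foo') // false: cleaning would change the pathname
```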
@@ -224,7 +224,7 @@ class Snapshot {
*
* @param {string} kind see {File#load}
* @param {ReadonlyBlobStore} blobStore
* @return {Promise<Record<string, File>>} an object where keys are the pathnames and
* @return {Promise<Object>} an object where keys are the pathnames and
* values are the files in the snapshot
*/
async loadFiles(kind, blobStore) {
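Note: only the return-type annotation changes here, but the call shape may be worth spelling out. A hedged sketch; the 'eager' kind value and the surrounding setup are assumptions (the JSDoc only points at File#load):

```js
// Sketch only: materializing the files of a snapshot.
const files = await snapshot.loadFiles('eager', blobStore) // kind value assumed
for (const [pathname, file] of Object.entries(files)) {
  // Typed as Record<string, File> on ext-ce, plain Object on v5.3.1-ext.
  console.log(pathname, file)
}
```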
@@ -132,7 +132,6 @@ export type RawScanOp = RawInsertOp | RawRemoveOp | RawRetainOp

export type RawTextOperation = {
textOperation: RawScanOp[]
contentHash?: string
}

export type RawAddCommentOperation = {
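Note: the only difference in RawTextOperation is the optional contentHash. A literal example of the shape (the scan-op encoding in the comment reflects how ops.map(op => op.toJSON()) serializes, and should be read as an assumption):

```js
// Shape example for RawTextOperation; values are illustrative.
const raw = {
  textOperation: [6, -5, 'there'], // assumed encoding: retain 6, remove 5, insert 'there'
  contentHash: '0000000000000000000000000000000000000000', // ext-ce only, optional
}
```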
@@ -20,7 +20,7 @@
"@types/check-types": "^7.3.7",
"@types/path-browserify": "^1.0.2",
"chai": "^3.3.0",
"mocha": "^11.1.0",
"mocha": "^10.2.0",
"sinon": "^9.2.4",
"typescript": "^5.0.4"
},
@@ -193,13 +193,4 @@ describe('LazyStringFileData', function () {
expect(fileData.getStringLength()).to.equal(longString.length)
expect(fileData.getOperations()).to.have.length(1)
})

it('truncates its operations after being stored', async function () {
const testHash = File.EMPTY_FILE_HASH
const fileData = new LazyStringFileData(testHash, undefined, 0)
fileData.edit(new TextOperation().insert('abc'))
const stored = await fileData.store(this.blobStore)
expect(fileData.hash).to.equal(stored.hash)
expect(fileData.operations).to.deep.equal([])
})
})
@@ -1,3 +1,4 @@
// @ts-check
'use strict'

const { expect } = require('chai')

@@ -448,44 +449,4 @@ describe('Range', function () {
expect(() => range.insertAt(16, 3)).to.throw()
})
})

describe('intersect', function () {
it('should handle partially overlapping ranges', function () {
const range1 = new Range(5, 10)
const range2 = new Range(3, 6)
const intersection1 = range1.intersect(range2)
expect(intersection1.pos).to.equal(5)
expect(intersection1.length).to.equal(4)
const intersection2 = range2.intersect(range1)
expect(intersection2.pos).to.equal(5)
expect(intersection2.length).to.equal(4)
})

it('should intersect with itself', function () {
const range = new Range(5, 10)
const intersection = range.intersect(range)
expect(intersection.pos).to.equal(5)
expect(intersection.length).to.equal(10)
})

it('should handle nested ranges', function () {
const range1 = new Range(5, 10)
const range2 = new Range(7, 2)
const intersection1 = range1.intersect(range2)
expect(intersection1.pos).to.equal(7)
expect(intersection1.length).to.equal(2)
const intersection2 = range2.intersect(range1)
expect(intersection2.pos).to.equal(7)
expect(intersection2.length).to.equal(2)
})

it('should handle disconnected ranges', function () {
const range1 = new Range(5, 10)
const range2 = new Range(20, 30)
const intersection1 = range1.intersect(range2)
expect(intersection1).to.be.null
const intersection2 = range2.intersect(range1)
expect(intersection2).to.be.null
})
})
})
@@ -5,11 +5,10 @@ const ot = require('..')
const safePathname = ot.safePathname

describe('safePathname', function () {
function expectClean(input, output, reason = '') {
function expectClean(input, output) {
// check expected output and also idempotency
const [cleanedInput, gotReason] = safePathname.cleanDebug(input)
const cleanedInput = safePathname.clean(input)
expect(cleanedInput).to.equal(output)
expect(gotReason).to.equal(reason)
expect(safePathname.clean(cleanedInput)).to.equal(cleanedInput)
expect(safePathname.isClean(cleanedInput)).to.be.true
}

@@ -23,56 +22,44 @@ describe('safePathname', function () {
expect(safePathname.isClean('rm -rf /')).to.be.falsy

// replace invalid characters with underscores
expectClean(
'test-s*\u0001\u0002m\u0007st\u0008.jpg',
'test-s___m_st_.jpg',
'cleanPart'
)
expectClean('test-s*\u0001\u0002m\u0007st\u0008.jpg', 'test-s___m_st_.jpg')

// keep slashes, normalize paths, replace ..
expectClean('./foo', 'foo', 'normalize')
expectClean('../foo', '__/foo', 'cleanPart')
expectClean('foo/./bar', 'foo/bar', 'normalize')
expectClean('foo/../bar', 'bar', 'normalize')
expectClean('../../tricky/foo.bar', '__/__/tricky/foo.bar', 'cleanPart')
expectClean(
'foo/../../tricky/foo.bar',
'__/tricky/foo.bar',
'normalize,cleanPart'
)
expectClean('foo/bar/../../tricky/foo.bar', 'tricky/foo.bar', 'normalize')
expectClean(
'foo/bar/baz/../../tricky/foo.bar',
'foo/tricky/foo.bar',
'normalize'
)
expectClean('./foo', 'foo')
expectClean('../foo', '__/foo')
expectClean('foo/./bar', 'foo/bar')
expectClean('foo/../bar', 'bar')
expectClean('../../tricky/foo.bar', '__/__/tricky/foo.bar')
expectClean('foo/../../tricky/foo.bar', '__/tricky/foo.bar')
expectClean('foo/bar/../../tricky/foo.bar', 'tricky/foo.bar')
expectClean('foo/bar/baz/../../tricky/foo.bar', 'foo/tricky/foo.bar')

// remove illegal chars even when there is no extension
expectClean('**foo', '__foo', 'cleanPart')
expectClean('**foo', '__foo')

// remove windows file paths
expectClean('c:\\temp\\foo.txt', 'c:/temp/foo.txt', 'workaround for IE')
expectClean('c:\\temp\\foo.txt', 'c:/temp/foo.txt')

// do not allow a leading slash (relative paths only)
expectClean('/foo', '_/foo', 'no leading /')
expectClean('//foo', '_/foo', 'normalize,no leading /')
expectClean('/foo', '_/foo')
expectClean('//foo', '_/foo')

// do not allow multiple leading slashes
expectClean('//foo', '_/foo', 'normalize,no leading /')
expectClean('//foo', '_/foo')

// do not allow a trailing slash
expectClean('/', '_', 'no leading /,no trailing /')
expectClean('foo/', 'foo', 'no trailing /')
expectClean('foo.tex/', 'foo.tex', 'no trailing /')
expectClean('/', '_')
expectClean('foo/', 'foo')
expectClean('foo.tex/', 'foo.tex')

// do not allow multiple trailing slashes
expectClean('//', '_', 'normalize,no leading /,no trailing /')
expectClean('///', '_', 'normalize,no leading /,no trailing /')
expectClean('foo//', 'foo', 'normalize,no trailing /')
expectClean('//', '_')
expectClean('///', '_')
expectClean('foo//', 'foo')

// file and folder names that consist of . and .. are not OK
expectClean('.', '_', 'cleanPart')
expectClean('..', '__', 'cleanPart')
expectClean('.', '_')
expectClean('..', '__')
// we will allow name with more dots e.g. ... and ....
expectClean('...', '...')
expectClean('....', '....')

@@ -95,10 +82,10 @@ describe('safePathname', function () {
expectClean('a b.png', 'a b.png')

// leading and trailing spaces are not OK
expectClean(' foo', 'foo', 'no leading spaces')
expectClean(' foo', 'foo', 'no leading spaces')
expectClean('foo ', 'foo', 'no trailing spaces')
expectClean('foo ', 'foo', 'no trailing spaces')
expectClean(' foo', 'foo')
expectClean(' foo', 'foo')
expectClean('foo ', 'foo')
expectClean('foo ', 'foo')

// reserved file names on Windows should not be OK, but we already have
// some in the old system, so have to allow them for now

@@ -113,14 +100,14 @@ describe('safePathname', function () {
// there's no particular reason to allow multiple slashes; sometimes people
// seem to rename files to URLs (https://domain/path) in an attempt to
// upload a file, and this results in an empty directory name
expectClean('foo//bar.png', 'foo/bar.png', 'normalize')
expectClean('foo///bar.png', 'foo/bar.png', 'normalize')
expectClean('foo//bar.png', 'foo/bar.png')
expectClean('foo///bar.png', 'foo/bar.png')

// Check javascript property handling
expectClean('foo/prototype', 'foo/prototype') // OK as part of a pathname
expectClean('prototype/test.txt', 'prototype/test.txt')
expectClean('prototype', '@prototype', 'BLOCKED_FILE_RX') // not OK as whole pathname
expectClean('hasOwnProperty', '@hasOwnProperty', 'BLOCKED_FILE_RX')
expectClean('**proto**', '@__proto__', 'cleanPart,BLOCKED_FILE_RX')
expectClean('prototype', '@prototype') // not OK as whole pathname
expectClean('hasOwnProperty', '@hasOwnProperty')
expectClean('**proto**', '@__proto__')
})
})
@@ -107,7 +107,7 @@ describe('RetainOp', function () {
expect(op1.equals(new RetainOp(3))).to.be.true
})

it('cannot merge with another RetainOp if the tracking user is different', function () {
it('cannot merge with another RetainOp if tracking info is different', function () {
const op1 = new RetainOp(
4,
new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))

@@ -120,14 +120,14 @@ describe('RetainOp', function () {
expect(() => op1.mergeWith(op2)).to.throw(Error)
})

it('can merge with another RetainOp if the tracking user is the same', function () {
it('can merge with another RetainOp if tracking info is the same', function () {
const op1 = new RetainOp(
4,
new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
)
const op2 = new RetainOp(
4,
new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:01.000Z'))
new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))
)
op1.mergeWith(op2)
expect(

@@ -310,7 +310,7 @@ describe('InsertOp', function () {
expect(() => op1.mergeWith(op2)).to.throw(Error)
})

it('cannot merge with another InsertOp if tracking user is different', function () {
it('cannot merge with another InsertOp if tracking info is different', function () {
const op1 = new InsertOp(
'a',
new TrackingProps('insert', 'user1', new Date('2024-01-01T00:00:00.000Z'))

@@ -323,7 +323,7 @@ describe('InsertOp', function () {
expect(() => op1.mergeWith(op2)).to.throw(Error)
})

it('can merge with another InsertOp if tracking user and comment info is the same', function () {
it('can merge with another InsertOp if tracking and comment info is the same', function () {
const op1 = new InsertOp(
'a',
new TrackingProps(

@@ -338,7 +338,7 @@ describe('InsertOp', function () {
new TrackingProps(
'insert',
'user1',
new Date('2024-01-01T00:00:01.000Z')
new Date('2024-01-01T00:00:00.000Z')
),
['1', '2']
)
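Note: the renamed tests narrow what "mergeable" means. On ext-ce the suite merges two RetainOps whose tracking user matches even though the timestamps differ; the v5.3.1-ext suite only exercises identical tracking props. Restated outside the harness:

```js
// Sketch of the merge rule these tests pin down.
const ts1 = new Date('2024-01-01T00:00:00.000Z')
const ts2 = new Date('2024-01-01T00:00:01.000Z')

const a = new RetainOp(4, new TrackingProps('insert', 'user1', ts1))
const b = new RetainOp(4, new TrackingProps('insert', 'user2', ts1))
// Different tracking user: rejected on both sides.
// expect(() => a.mergeWith(b)).to.throw(Error)

const c = new RetainOp(4, new TrackingProps('insert', 'user1', ts2))
a.mergeWith(c) // merges on ext-ce despite the differing timestamp;
               // v5.3.1-ext only demonstrates the identical-props case
```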
@@ -322,47 +322,6 @@ describe('TextOperation', function () {
new TextOperation().retain(4).remove(4).retain(3)
)
})

it('undoing a tracked delete restores the tracked changes', function () {
expectInverseToLeadToInitialState(
new StringFileData(
'the quick brown fox jumps over the lazy dog',
undefined,
[
{
range: { pos: 5, length: 5 },
tracking: {
ts: '2023-01-01T00:00:00.000Z',
type: 'insert',
userId: 'user1',
},
},
{
range: { pos: 12, length: 3 },
tracking: {
ts: '2023-01-01T00:00:00.000Z',
type: 'delete',
userId: 'user1',
},
},
{
range: { pos: 18, length: 5 },
tracking: {
ts: '2023-01-01T00:00:00.000Z',
type: 'insert',
userId: 'user1',
},
},
]
),
new TextOperation()
.retain(7)
.retain(13, {
tracking: new TrackingProps('delete', 'user1', new Date()),
})
.retain(23)
)
})
})

describe('compose', function () {
libraries/promise-utils/.dockerignore (Normal file, 1 line)
@@ -0,0 +1 @@
node_modules/

libraries/promise-utils/.gitignore (Normal file, vendored, 3 lines)
@@ -0,0 +1,3 @@

# managed by monorepo$ bin/update_build_scripts
.npmrc

@@ -1 +1 @@
22.17.0
20.18.0

@@ -1,10 +1,10 @@
promise-utils
--dependencies=None
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
--docker-repos=gcr.io/overleaf-ops
--env-add=
--env-pass-through=
--esmock-loader=False
--is-library=True
--node-version=22.17.0
--node-version=20.18.0
--public-repo=False
--script-version=4.7.0
--script-version=4.5.0
@@ -13,7 +13,6 @@ module.exports = {
expressify,
expressifyErrorHandler,
promiseMapWithLimit,
promiseMapSettledWithLimit,
}

/**

@@ -265,19 +264,3 @@ async function promiseMapWithLimit(concurrency, array, fn) {
const limit = pLimit(concurrency)
return await Promise.all(array.map(x => limit(() => fn(x))))
}

/**
* Map values in `array` with the async function `fn`
*
* Limit the number of unresolved promises to `concurrency`.
*
* @template T, U
* @param {number} concurrency
* @param {Array<T>} array
* @param {(T) => Promise<U>} fn
* @return {Promise<Array<PromiseSettledResult<U>>>}
*/
function promiseMapSettledWithLimit(concurrency, array, fn) {
const limit = pLimit(concurrency)
return Promise.allSettled(array.map(x => limit(() => fn(x))))
}
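Note: promiseMapSettledWithLimit exists only on the ext-ce side; its JSDoc above fully specifies the signature. A short usage sketch (the package name in the import is an assumption):

```js
// Sketch only: bounded-concurrency mapping that never short-circuits.
const {
  promiseMapWithLimit,
  promiseMapSettledWithLimit, // ext-ce only
} = require('@overleaf/promise-utils') // package name assumed

const urls = ['https://a.example', 'https://b.example', 'https://c.example']

// At most 2 unresolved promises at any time; rejections are reported,
// not thrown, unlike promiseMapWithLimit which rejects on first failure.
const settled = await promiseMapSettledWithLimit(2, urls, url => fetch(url))
for (const result of settled) {
  if (result.status === 'rejected') console.warn(result.reason)
}
```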
@@ -18,7 +18,7 @@
"devDependencies": {
"chai": "^4.3.10",
"chai-as-promised": "^7.1.1",
"mocha": "^11.1.0",
"mocha": "^10.2.0",
"typescript": "^5.0.4"
},
"dependencies": {
libraries/ranges-tracker/.dockerignore (Normal file, 1 line)
@@ -0,0 +1 @@
node_modules/

libraries/ranges-tracker/.gitignore (Normal file, vendored, 13 lines)
@@ -0,0 +1,13 @@
**.swp

app.js
app/js/
test/unit/js/
public/build/

node_modules/

/public/js/chat.js
plato/

.npmrc

@@ -1 +1 @@
22.17.0
20.18.0

@@ -1,10 +1,10 @@
ranges-tracker
--dependencies=None
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
--docker-repos=gcr.io/overleaf-ops
--env-add=
--env-pass-through=
--esmock-loader=False
--is-library=True
--node-version=22.17.0
--node-version=20.18.0
--public-repo=False
--script-version=4.7.0
--script-version=4.5.0
@@ -145,7 +145,11 @@ class RangesTracker {
}

removeChangeId(changeId) {
this.removeChangeIds([changeId])
const change = this.getChange(changeId)
if (change == null) {
return
}
this._removeChange(change)
}

removeChangeIds(ids) {

@@ -312,7 +316,7 @@ class RangesTracker {
const movedChanges = []
const removeChanges = []
const newChanges = []
const trackedDeletesAtOpPosition = []

for (let i = 0; i < this.changes.length; i++) {
change = this.changes[i]
const changeStart = change.op.p

@@ -323,15 +327,13 @@ class RangesTracker {
change.op.p += opLength
movedChanges.push(change)
} else if (opStart === changeStart) {
// If we are undoing, then we want to cancel any existing delete ranges if we can.
// Check if the insert matches the start of the delete, and just remove it from the delete instead if so.
if (
!alreadyMerged &&
undoing &&
change.op.d.length >= op.i.length &&
change.op.d.slice(0, op.i.length) === op.i
) {
// If we are undoing, then we want to reject any existing tracked delete if we can.
// Check if the insert matches the start of the delete, and just
// remove it from the delete instead if so.
change.op.d = change.op.d.slice(op.i.length)
change.op.p += op.i.length
if (change.op.d === '') {

@@ -340,25 +342,9 @@ class RangesTracker {
movedChanges.push(change)
}
alreadyMerged = true

// Any tracked delete that came before this tracked delete
// rejection was moved after the incoming insert. Move them back
// so that they appear before the tracked delete rejection.
for (const trackedDelete of trackedDeletesAtOpPosition) {
trackedDelete.op.p -= opLength
}
} else {
// We're not rejecting that tracked delete. Move it after the
// insert.
change.op.p += opLength
movedChanges.push(change)

// Keep track of tracked deletes that are at the same position as the
// insert. If we find a tracked delete to reject, we'll want to
// reposition them.
if (!alreadyMerged) {
trackedDeletesAtOpPosition.push(change)
}
}
}
} else if (change.op.i != null) {

@@ -638,11 +624,9 @@ class RangesTracker {
}

_addOp(op, metadata) {
// Don't take a reference to the existing op since we'll modify this in place with future changes
op = this._clone(op)
const change = {
id: this.newId(),
op,
op: this._clone(op), // Don't take a reference to the existing op since we'll modify this in place with future changes
metadata: this._clone(metadata),
}
this.changes.push(change)

@@ -665,7 +649,7 @@ class RangesTracker {
}

_removeChange(change) {
this.changes = this.changes.filter(c => c !== change)
this.changes = this.changes.filter(c => c.id !== change.id)
this._markAsDirty(change, 'change', 'removed')
}
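Note: the hunks above change two behaviours that the removed tests below exercise: an undo-flagged insert rejects a matching tracked delete while keeping other deletes at the same position in text order, and removeChangeId/_removeChange on ext-ce remove a change by object identity, so a duplicate id no longer takes its siblings with it. A condensed sketch reusing the test fixtures:

```js
// Grounded in the RangesTracker tests later in this diff.
const tracker = new RangesTracker(
  [
    { id: 'id1', op: { p: 10, d: 'one' } },
    { id: 'id1', op: { p: 20, d: 'two' } }, // duplicate id, distinct change
  ],
  [] // comments
)

// ext-ce: getChange finds the first match and _removeChange filters by
// object identity, so only that change goes away:
tracker.removeChangeId('id1') // leaves { p: 20, d: 'two' } in place
// v5.3.1-ext filtered by id (c.id !== change.id), which would have
// removed both entries at once.

// Rejecting a tracked delete with an undo-flagged insert:
tracker.track_changes = true
tracker.applyOp({ p: 20, i: 'two', u: true }, { user_id: 'user-id' })
// The matching tracked delete is cancelled rather than re-inserted.
```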
@@ -20,7 +20,7 @@
},
"devDependencies": {
"chai": "^4.3.6",
"mocha": "^11.1.0",
"mocha": "^10.2.0",
"typescript": "^5.0.4"
}
}
@@ -4,7 +4,6 @@ const RangesTracker = require('../..')
describe('RangesTracker', function () {
describe('with duplicate change ids', function () {
beforeEach(function () {
this.comments = []
this.changes = [
{ id: 'id1', op: { p: 1, i: 'hello' } },
{ id: 'id2', op: { p: 10, i: 'world' } },

@@ -27,199 +26,4 @@ describe('RangesTracker', function () {
expect(this.rangesTracker.changes).to.deep.equal([this.changes[2]])
})
})

describe('with duplicate tracked insert ids', function () {
beforeEach(function () {
this.comments = []
this.changes = [
{ id: 'id1', op: { p: 10, i: 'one' } },
{ id: 'id1', op: { p: 20, i: 'two' } },
{ id: 'id1', op: { p: 30, d: 'three' } },
]
this.rangesTracker = new RangesTracker(this.changes, this.comments)
})

it("deleting one tracked insert doesn't delete the others", function () {
this.rangesTracker.applyOp({ p: 20, d: 'two' })
expect(this.rangesTracker.changes).to.deep.equal([
this.changes[0],
this.changes[2],
])
})
})

describe('with duplicate tracked delete ids', function () {
beforeEach(function () {
this.comments = []
this.changes = [
{ id: 'id1', op: { p: 10, d: 'one' } },
{ id: 'id1', op: { p: 20, d: 'two' } },
{ id: 'id1', op: { p: 30, d: 'three' } },
]
this.rangesTracker = new RangesTracker(this.changes, this.comments)
})

it('deleting over tracked deletes in tracked changes mode removes the tracked deletes covered', function () {
this.rangesTracker.track_changes = true
this.rangesTracker.applyOp({
p: 15,
d: '567890123456789012345',
})
expect(this.rangesTracker.changes.map(c => c.op)).to.deep.equal([
{ p: 10, d: 'one' },
{ p: 15, d: '56789two0123456789three012345' },
])
})

it('a tracked delete between two tracked deletes joins them into a single tracked delete', function () {
this.rangesTracker.track_changes = true
this.rangesTracker.applyOp({
p: 20,
d: '0123456789',
})
expect(this.rangesTracker.changes.map(c => c.op)).to.deep.equal([
{ p: 10, d: 'one' },
{ p: 20, d: 'two0123456789three' },
])
})

it("rejecting one tracked delete doesn't reject the others", function () {
this.rangesTracker.track_changes = true
this.rangesTracker.applyOp({
p: 20,
i: 'two',
u: true,
})
expect(this.rangesTracker.changes.map(c => c.op)).to.deep.equal([
{ p: 10, d: 'one' },
{ p: 33, d: 'three' },
])
})

it("rejecting all tracked deletes doesn't introduce tracked inserts", function () {
this.rangesTracker.track_changes = true
this.rangesTracker.applyOp({
p: 10,
i: 'one',
u: true,
})
this.rangesTracker.applyOp({
p: 23,
i: 'two',
u: true,
})
this.rangesTracker.applyOp({
p: 36,
i: 'three',
u: true,
})
expect(this.rangesTracker.changes.map(c => c.op)).to.deep.equal([])
})
})

describe('with multiple tracked deletes at the same position', function () {
beforeEach(function () {
this.comments = []
this.changes = [
{ id: 'id1', op: { p: 33, d: 'before' } },
{ id: 'id2', op: { p: 50, d: 'right before' } },
{ id: 'id3', op: { p: 50, d: 'this one' } },
{ id: 'id4', op: { p: 50, d: 'right after' } },
{ id: 'id5', op: { p: 75, d: 'long after' } },
]
this.rangesTracker = new RangesTracker(this.changes, this.comments)
})

it('preserves the text order when rejecting changes', function () {
this.rangesTracker.applyOp(
{ p: 50, i: 'this one', u: true },
{ user_id: 'user-id' }
)
expect(this.rangesTracker.changes).to.deep.equal([
{ id: 'id1', op: { p: 33, d: 'before' } },
{ id: 'id2', op: { p: 50, d: 'right before' } },
{ id: 'id4', op: { p: 58, d: 'right after' } },
{ id: 'id5', op: { p: 83, d: 'long after' } },
])
})

it('moves all tracked deletes after the insert if not rejecting changes', function () {
this.rangesTracker.applyOp(
{ p: 50, i: 'some other text', u: true, orderedRejections: true },
{ user_id: 'user-id' }
)
expect(this.rangesTracker.changes).to.deep.equal([
{ id: 'id1', op: { p: 33, d: 'before' } },
{ id: 'id2', op: { p: 65, d: 'right before' } },
{ id: 'id3', op: { p: 65, d: 'this one' } },
{ id: 'id4', op: { p: 65, d: 'right after' } },
{ id: 'id5', op: { p: 90, d: 'long after' } },
])
})
})

describe('with multiple tracked deletes at the same position with the same content', function () {
beforeEach(function () {
this.comments = []
this.changes = [
{ id: 'id1', op: { p: 10, d: 'cat' } },
{ id: 'id2', op: { p: 10, d: 'giraffe' } },
{ id: 'id3', op: { p: 10, d: 'cat' } },
{ id: 'id4', op: { p: 10, d: 'giraffe' } },
]
this.rangesTracker = new RangesTracker(this.changes, this.comments)
})

it('removes only the first matching tracked delete', function () {
this.rangesTracker.applyOp(
{ p: 10, i: 'giraffe', u: true },
{ user_id: 'user-id' }
)
expect(this.rangesTracker.changes).to.deep.equal([
{ id: 'id1', op: { p: 10, d: 'cat' } },
{ id: 'id3', op: { p: 17, d: 'cat' } },
{ id: 'id4', op: { p: 17, d: 'giraffe' } },
])
})
})

describe('with a tracked insert at the same position as a tracked delete', function () {
beforeEach(function () {
this.comments = []
this.changes = [
{
id: 'id1',
op: { p: 5, d: 'before' },
metadata: { user_id: 'user-id' },
},
{
id: 'id2',
op: { p: 10, d: 'delete' },
metadata: { user_id: 'user-id' },
},
{
id: 'id3',
op: { p: 10, i: 'insert' },
metadata: { user_id: 'user-id' },
},
]
this.rangesTracker = new RangesTracker(this.changes, this.comments)
})

it('places a tracked insert at the same position before both the delete and the insert', function () {
this.rangesTracker.track_changes = true
this.rangesTracker.applyOp(
{ p: 10, i: 'incoming' },
{ user_id: 'user-id' }
)
expect(this.rangesTracker.changes.map(change => change.op)).to.deep.equal(
[
{ p: 5, d: 'before' },
{ p: 10, i: 'incoming' },
{ p: 18, d: 'delete' },
{ p: 18, i: 'insert' },
]
)
})
})
})
libraries/redis-wrapper/.dockerignore (Normal file, 1 line)
@@ -0,0 +1 @@
node_modules/

libraries/redis-wrapper/.gitignore (Normal file, vendored, 13 lines)
@@ -0,0 +1,13 @@
**.swp

app.js
app/js/
test/unit/js/
public/build/

node_modules/

/public/js/chat.js
plato/

.npmrc

@@ -1 +1 @@
22.17.0
20.18.0
Some files were not shown because too many files have changed in this diff.