Mirror of https://github.com/yu-i-i/overleaf-cep.git (synced 2025-07-23 05:00:07 +02:00)
Compare commits: c1c98cd3f5...3461c105f4 (67 commits)
Commits:

3461c105f4, bfc75552a5, 0d3890e2ae, a66cfc15ec, 9f8136d13a, 5e527b9a61,
88283f054d, e9efde94b7, e8813f89cb, 07a66fe94c, af636b8940, 08d22264c6,
53f532fa29, 94c0234284, 176a1a4f96, 3ce4381768, 7de6dffb3c, dce4c64534,
8af4a2996a, 73ff0a0eee, 571735fd8f, 2253ec577e, 88486fa491, 26521730ef,
7bb14e7e9d, 9e0792f665, 8a20b2b5a1, ae99d681bb, b035613237, 2a0d304b70,
da8bcee3c1, f38be34f46, d86d6519e6, 1e3e9a4096, d0b38798d8, a31f70f4db,
bfecca5eb3, b6f4eaf1df, 747b021030, db3f0d08dc, 586afb3e70, f90b086f32,
29ed51f81b, 41d0404df4, beff3fdb07, f5859e373f, f1e9b0645c, 47cefe1c45,
c15930080c, 3b3fc01308, 5d3d056af2, 010192506b, 9821e64994, 3eb637d999,
0546fb7233, b1880ba64d, 082121d3da, 81f0807fc6, bf43d4f709, ae3f63d37f,
30b0cabbbc, 2f427ef0e0, 0778bab910, d5b5710d01, 868d562d96, 5d79cf18c0,
7ecee2e0aa
232 changed files with 10859 additions and 686 deletions
README.md (60 lines changed)
@@ -14,44 +14,52 @@
   <a href="#license">License</a>
 </p>
-<img src="doc/screenshot.png" alt="A screenshot of a project being edited in Overleaf Community Edition">
+<img src="doc/screenshot.png" alt="A screenshot of a project being edited in Overleaf Extended Community Edition">
 <p align="center">
-  Figure 1: A screenshot of a project being edited in Overleaf Community Edition.
+  Figure 1: A screenshot of a project being edited in Overleaf Extended Community Edition.
 </p>
 
 ## Community Edition
 
-[Overleaf](https://www.overleaf.com) is an open-source online real-time collaborative LaTeX editor. We run a hosted version at [www.overleaf.com](https://www.overleaf.com), but you can also run your own local version, and contribute to the development of Overleaf.
+[Overleaf](https://www.overleaf.com) is an open-source online real-time collaborative LaTeX editor. Overleaf runs a hosted version at [www.overleaf.com](https://www.overleaf.com), but you can also run your own local version, and contribute to the development of Overleaf.
+
+## Extended Community Edition
+
+The present "extended" version of Overleaf CE includes:
+
+- Template Gallery
+- Sandboxed Compiles with TeX Live image selection
+- LDAP authentication
+- SAML authentication
+- OpenID Connect authentication
+- Real-time track changes and comments
+- Autocomplete of reference keys
+- Symbol Palette
+- "From External URL" feature
 
 > [!CAUTION]
 > Overleaf Community Edition is intended for use in environments where **all** users are trusted. Community Edition is **not** appropriate for scenarios where isolation of users is required due to Sandbox Compiles not being available. When not using Sandboxed Compiles, users have full read and write access to the `sharelatex` container resources (filesystem, network, environment variables) when running LaTeX compiles.
+Therefore, in any environment where not all users can be fully trusted, it is strongly recommended to enable the Sandboxed Compiles feature available in the Extended Community Edition.
 
-For more information on Sandbox Compiles check out our [documentation](https://docs.overleaf.com/on-premises/configuration/overleaf-toolkit/server-pro-only-configuration/sandboxed-compiles).
+For more information on Sandbox Compiles check out Overleaf [documentation](https://docs.overleaf.com/on-premises/configuration/overleaf-toolkit/server-pro-only-configuration/sandboxed-compiles).
 
 ## Enterprise
 
-If you want help installing and maintaining Overleaf in your lab or workplace, we offer an officially supported version called [Overleaf Server Pro](https://www.overleaf.com/for/enterprises). It also includes more features for security (SSO with LDAP or SAML), administration and collaboration (e.g. tracked changes). [Find out more!](https://www.overleaf.com/for/enterprises)
-
-## Keeping up to date
-
-Sign up to the [mailing list](https://mailchi.mp/overleaf.com/community-edition-and-server-pro) to get updates on Overleaf releases and development.
+If you want help installing and maintaining Overleaf in your lab or workplace, Overleaf offers an officially supported version called [Overleaf Server Pro](https://www.overleaf.com/for/enterprises).
 
 ## Installation
 
-We have detailed installation instructions in the [Overleaf Toolkit](https://github.com/overleaf/toolkit/).
-
-## Upgrading
-
-If you are upgrading from a previous version of Overleaf, please see the [Release Notes section on the Wiki](https://github.com/overleaf/overleaf/wiki#release-notes) for all of the versions between your current version and the version you are upgrading to.
+Detailed installation instructions can be found in the [Overleaf Toolkit](https://github.com/overleaf/toolkit/).
+Configuration details and release history for the Extended Community Edition can be found on the [Extended CE Wiki Page](https://github.com/yu-i-i/overleaf-cep/wiki).
 
 ## Overleaf Docker Image
 
 This repo contains two dockerfiles, [`Dockerfile-base`](server-ce/Dockerfile-base), which builds the
-`sharelatex/sharelatex-base` image, and [`Dockerfile`](server-ce/Dockerfile) which builds the
-`sharelatex/sharelatex` (or "community") image.
+`sharelatex/sharelatex-base:ext-ce` image, and [`Dockerfile`](server-ce/Dockerfile) which builds the
+`sharelatex/sharelatex:ext-ce` image.
 
 The Base image generally contains the basic dependencies like `wget`, plus `texlive`.
-We split this out because it's a pretty heavy set of
+This is split out because it's a pretty heavy set of
 dependencies, and it's nice to not have to rebuild all of that every time.
 
 The `sharelatex/sharelatex` image extends the base image and adds the actual Overleaf code
@@ -59,20 +67,16 @@ and services.
 
 Use `make build-base` and `make build-community` from `server-ce/` to build these images.
 
-We use the [Phusion base-image](https://github.com/phusion/baseimage-docker)
-(which is extended by our `base` image) to provide us with a VM-like container
+The [Phusion base-image](https://github.com/phusion/baseimage-docker)
+(which is extended by the `base` image) provides a VM-like container
 in which to run the Overleaf services. Baseimage uses the `runit` service
-manager to manage services, and we add our init-scripts from the `server-ce/runit`
-folder.
-
-## Contributing
-
-Please see the [CONTRIBUTING](CONTRIBUTING.md) file for information on contributing to the development of Overleaf.
+manager to manage services, and init scripts from the `server-ce/runit`
+folder are added.
 
 ## Authors
 
 [The Overleaf Team](https://www.overleaf.com/about)
+[yu-i-i](https://github.com/yu-i-i/overleaf-cep) — Extensions for CE unless otherwise noted
 
 ## License
@@ -77,6 +77,7 @@ each service:
 | `filestore` | 9235 |
 | `notifications` | 9236 |
 | `real-time` | 9237 |
+| `references` | 9238 |
 | `history-v1` | 9239 |
 | `project-history` | 9240 |
@@ -15,6 +15,7 @@ PROJECT_HISTORY_HOST=project-history
 QUEUES_REDIS_HOST=redis
 REALTIME_HOST=real-time
 REDIS_HOST=redis
+REFERENCES_HOST=references
 SESSION_SECRET=foo
 V1_HISTORY_HOST=history-v1
 WEBPACK_HOST=webpack
@@ -112,6 +112,17 @@ services:
       - ../services/real-time/app.js:/overleaf/services/real-time/app.js
       - ../services/real-time/config:/overleaf/services/real-time/config
 
+  references:
+    command: ["node", "--watch", "app.js"]
+    environment:
+      - NODE_OPTIONS=--inspect=0.0.0.0:9229
+    ports:
+      - "127.0.0.1:9238:9229"
+    volumes:
+      - ../services/references/app:/overleaf/services/references/app
+      - ../services/references/config:/overleaf/services/references/config
+      - ../services/references/app.js:/overleaf/services/references/app.js
+
   web:
     command: ["node", "--watch", "app.mjs", "--watch-locales"]
     environment:
@@ -123,7 +123,7 @@ services:
       dockerfile: services/real-time/Dockerfile
     env_file:
       - dev.env
 
   redis:
     image: redis:5
     ports:
@@ -131,6 +131,13 @@ services:
     volumes:
       - redis-data:/data
 
+  references:
+    build:
+      context: ..
+      dockerfile: services/references/Dockerfile
+    env_file:
+      - dev.env
+
   web:
     build:
       context: ..
@@ -140,7 +147,7 @@ services:
       - dev.env
     environment:
       - APP_NAME=Overleaf Community Edition
-      - ENABLED_LINKED_FILE_TYPES=project_file,project_output_file
+      - ENABLED_LINKED_FILE_TYPES=project_file,project_output_file,url
      - EMAIL_CONFIRMATION_DISABLED=true
      - NODE_ENV=development
      - OVERLEAF_ALLOW_PUBLIC_ACCESS=true
@@ -161,6 +168,7 @@ services:
       - notifications
       - project-history
       - real-time
+      - references
 
   webpack:
     build:
Binary file not shown (before: 587 KiB; after: 1 MiB).
@@ -32,7 +32,7 @@ services:
       OVERLEAF_REDIS_HOST: redis
       REDIS_HOST: redis
 
-      ENABLED_LINKED_FILE_TYPES: 'project_file,project_output_file'
+      ENABLED_LINKED_FILE_TYPES: 'project_file,project_output_file,url'
 
       # Enables Thumbnail generation using ImageMagick
       ENABLE_CONVERSIONS: 'true'
package-lock.json (generated; 31 lines changed)
@@ -35581,6 +35581,7 @@
       "resolved": "https://registry.npmjs.org/request/-/request-2.88.2.tgz",
       "integrity": "sha512-MsvtOrfG9ZcrOwAW+Qi+F6HbD0CWXEh9ou77uOb7FM2WPhwT7smM833PzanhJLsgXjN89Ir6V2PczXNnMpwKhw==",
+      "deprecated": "request has been deprecated, see https://github.com/request/request/issues/3142",
       "license": "Apache-2.0",
       "dependencies": {
         "aws-sign2": "~0.7.0",
         "aws4": "^1.8.0",
@@ -35638,15 +35639,15 @@
       }
     },
     "node_modules/request/node_modules/tough-cookie": {
-      "version": "2.5.0",
-      "resolved": "https://registry.npmjs.org/tough-cookie/-/tough-cookie-2.5.0.tgz",
-      "integrity": "sha512-nlLsUzgm1kfLXSXfRZMc1KLAugd4hqJHDTvc2hDIwS3mZAfMEuMbc03SujMF+GEcpaX/qboeycw6iO8JwVv2+g==",
+      "version": "5.1.2",
+      "resolved": "https://registry.npmjs.org/tough-cookie/-/tough-cookie-5.1.2.tgz",
+      "integrity": "sha512-FVDYdxtnj0G6Qm/DhNPSb8Ju59ULcup3tuJxkFb5K8Bv2pUXILbf0xZWU8PX8Ov19OXljbUyveOFwRMwkXzO+A==",
       "license": "BSD-3-Clause",
       "dependencies": {
-        "psl": "^1.1.28",
-        "punycode": "^2.1.1"
+        "tldts": "^6.1.32"
       },
       "engines": {
-        "node": ">=0.8"
+        "node": ">=16"
       }
     },
     "node_modules/requestretry": {
@@ -39612,6 +39613,24 @@
         "tlds": "bin.js"
       }
     },
+    "node_modules/tldts": {
+      "version": "6.1.86",
+      "resolved": "https://registry.npmjs.org/tldts/-/tldts-6.1.86.tgz",
+      "integrity": "sha512-WMi/OQ2axVTf/ykqCQgXiIct+mSQDFdH2fkwhPwgEwvJ1kSzZRiinb0zF2Xb8u4+OqPChmyI6MEu4EezNJz+FQ==",
+      "license": "MIT",
+      "dependencies": {
+        "tldts-core": "^6.1.86"
+      },
+      "bin": {
+        "tldts": "bin/cli.js"
+      }
+    },
+    "node_modules/tldts-core": {
+      "version": "6.1.86",
+      "resolved": "https://registry.npmjs.org/tldts-core/-/tldts-core-6.1.86.tgz",
+      "integrity": "sha512-Je6p7pkk+KMzMv2XXKmAE3McmolOQFdxkKw0R8EYNr7sELW46JqnNeTX8ybPiQgvg1ymCoF8LXs5fzFaZvJPTA==",
+      "license": "MIT"
+    },
     "node_modules/tmp": {
       "version": "0.2.3",
       "resolved": "https://registry.npmjs.org/tmp/-/tmp-0.2.3.tgz",
@@ -33,6 +33,9 @@
       "path-to-regexp": "3.3.0",
       "body-parser": "1.20.3",
       "multer": "2.0.1"
+    },
+    "request@2.88.2": {
+      "tough-cookie": "5.1.2"
     }
   },
   "scripts": {
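Note: the added `overrides` entry uses npm's mechanism for pinning transitive dependencies. It forces the deprecated `request@2.88.2` package to resolve `tough-cookie` to 5.1.2 instead of 2.5.0, which is what drives the `tough-cookie` upgrade and the new `tldts`/`tldts-core` entries in `package-lock.json` above.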
patches/@node-saml+node-saml+4.0.5.patch (new file, 23 lines)
@@ -0,0 +1,23 @@
diff --git a/node_modules/@node-saml/node-saml/lib/saml.js b/node_modules/@node-saml/node-saml/lib/saml.js
index fba15b9..a5778cb 100644
--- a/node_modules/@node-saml/node-saml/lib/saml.js
+++ b/node_modules/@node-saml/node-saml/lib/saml.js
@@ -336,7 +336,8 @@ class SAML {
         const requestOrResponse = request || response;
         (0, utility_1.assertRequired)(requestOrResponse, "either request or response is required");
         let buffer;
-        if (this.options.skipRequestCompression) {
+        // logout requestOrResponse must be compressed anyway
+        if (this.options.skipRequestCompression && operation !== "logout") {
             buffer = Buffer.from(requestOrResponse, "utf8");
         }
         else {
@@ -495,7 +496,7 @@ class SAML {
         try {
             xml = Buffer.from(container.SAMLResponse, "base64").toString("utf8");
             doc = await (0, xml_1.parseDomFromString)(xml);
-            const inResponseToNodes = xml_1.xpath.selectAttributes(doc, "/*[local-name()='Response']/@InResponseTo");
+            const inResponseToNodes = xml_1.xpath.selectAttributes(doc, "/*[local-name()='Response' or local-name()='LogoutResponse']/@InResponseTo");
             if (inResponseToNodes) {
                 inResponseTo = inResponseToNodes.length ? inResponseToNodes[0].nodeValue : null;
                 await this.validateInResponseTo(inResponseTo);
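The patch adjusts SAML single-logout handling in two places: logout requests and responses are DEFLATE-compressed even when `skipRequestCompression` is set, and the `InResponseTo` lookup accepts `LogoutResponse` documents in addition to `Response`, so logout responses can pass `validateInResponseTo`.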
patches/ldapauth-fork+4.3.3.patch (new file, 64 lines)
@@ -0,0 +1,64 @@
diff --git a/node_modules/ldapauth-fork/lib/ldapauth.js b/node_modules/ldapauth-fork/lib/ldapauth.js
index 85ecf36a8b..a7d07e0f78 100644
--- a/node_modules/ldapauth-fork/lib/ldapauth.js
+++ b/node_modules/ldapauth-fork/lib/ldapauth.js
@@ -69,6 +69,7 @@ function LdapAuth(opts) {
   this.opts.bindProperty || (this.opts.bindProperty = 'dn');
   this.opts.groupSearchScope || (this.opts.groupSearchScope = 'sub');
   this.opts.groupDnProperty || (this.opts.groupDnProperty = 'dn');
+  this.opts.tlsStarted = false;
 
   EventEmitter.call(this);
 
@@ -108,21 +109,7 @@ function LdapAuth(opts) {
   this._userClient.on('error', this._handleError.bind(this));
 
   var self = this;
-  if (this.opts.starttls) {
-    // When starttls is enabled, this callback supplants the 'connect' callback
-    this._adminClient.starttls(this.opts.tlsOptions, this._adminClient.controls, function(err) {
-      if (err) {
-        self._handleError(err);
-      } else {
-        self._onConnectAdmin();
-      }
-    });
-    this._userClient.starttls(this.opts.tlsOptions, this._userClient.controls, function(err) {
-      if (err) {
-        self._handleError(err);
-      }
-    });
-  } else if (opts.reconnect) {
+  if (opts.reconnect && !this.opts.starttls) {
     this.once('_installReconnectListener', function() {
       self.log && self.log.trace('install reconnect listener');
       self._adminClient.on('connect', function() {
@@ -384,6 +371,28 @@ LdapAuth.prototype._findGroups = function(user, callback) {
  */
 LdapAuth.prototype.authenticate = function(username, password, callback) {
   var self = this;
+  if (this.opts.starttls && !this.opts.tlsStarted) {
+    // When starttls is enabled, this callback supplants the 'connect' callback
+    this._adminClient.starttls(this.opts.tlsOptions, this._adminClient.controls, function (err) {
+      if (err) {
+        self._handleError(err);
+      } else {
+        self._onConnectAdmin(function(){self._handleAuthenticate(username, password, callback);});
+      }
+    });
+    this._userClient.starttls(this.opts.tlsOptions, this._userClient.controls, function (err) {
+      if (err) {
+        self._handleError(err);
+      }
+    });
+  } else {
+    self._handleAuthenticate(username, password, callback);
+  }
+};
+
+LdapAuth.prototype._handleAuthenticate = function (username, password, callback) {
+  this.opts.tlsStarted = true;
+  var self = this;
 
   if (typeof password === 'undefined' || password === null || password === '') {
     return callback(new Error('no password given'));
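The patch defers the STARTTLS handshake from client construction to the first `authenticate()` call, tracked by the new `tlsStarted` flag. A minimal sketch of how this plays out from the caller's side, assuming ldapauth-fork's documented constructor options (the server URL and DNs below are placeholders, not from the diff):

const LdapAuth = require('ldapauth-fork')

// Placeholder connection settings; starttls upgrades the plain
// connection to TLS before any bind is attempted.
const auth = new LdapAuth({
  url: 'ldap://ldap.example.com:389',
  bindDN: 'cn=admin,dc=example,dc=com',
  bindCredentials: 'secret',
  searchBase: 'ou=users,dc=example,dc=com',
  searchFilter: '(uid={{username}})',
  starttls: true,
  tlsOptions: { rejectUnauthorized: true },
})

// With the patch, the starttls() exchange happens here, on first use,
// rather than in the constructor; then the admin bind and user search run.
auth.authenticate('alice', 'password', (err, user) => {
  if (err) return console.error(err)
  console.log('authenticated', user.dn)
})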
@@ -24,6 +24,7 @@ build-base:
		--cache-from $(OVERLEAF_BASE_BRANCH) \
		--tag $(OVERLEAF_BASE_TAG) \
+		--tag $(OVERLEAF_BASE_BRANCH) \
		--network=host \
		$(MONOREPO_ROOT)
 
@@ -39,6 +40,7 @@ build-community:
		--file Dockerfile \
		--tag $(OVERLEAF_TAG) \
+		--tag $(OVERLEAF_BRANCH) \
		--network=host \
		$(MONOREPO_ROOT)
 
 SHELLCHECK_OPTS = \
@@ -9,5 +9,6 @@ export HISTORY_V1_HOST=127.0.0.1
 export NOTIFICATIONS_HOST=127.0.0.1
 export PROJECT_HISTORY_HOST=127.0.0.1
 export REALTIME_HOST=127.0.0.1
+export REFERENCES_HOST=127.0.0.1
 export WEB_HOST=127.0.0.1
 export WEB_API_HOST=127.0.0.1
server-ce/runit/references-overleaf/run (new executable file, 12 lines)
@@ -0,0 +1,12 @@
#!/bin/bash

NODE_PARAMS=""
if [ "$DEBUG_NODE" == "true" ]; then
  echo "running debug - references"
  NODE_PARAMS="--inspect=0.0.0.0:30560"
fi

source /etc/overleaf/env.sh
export LISTEN_ADDRESS=127.0.0.1

exec /sbin/setuser www-data /usr/bin/node $NODE_PARAMS /overleaf/services/references/app.js >> /var/log/overleaf/references.log 2>&1
@@ -29,6 +29,9 @@ module.exports = [
   {
     name: 'project-history',
   },
+  {
+    name: 'references',
+  },
   {
     name: 'history-v1',
   },
@@ -21,9 +21,11 @@ test-e2e-native:
 
 test-e2e:
+	docker compose build host-admin
	docker compose up -d host-admin
	docker compose up --no-log-prefix --exit-code-from=e2e e2e
 
 test-e2e-open:
+	docker compose up -d host-admin
	docker compose up --no-log-prefix --exit-code-from=e2e-open e2e-open
 
 clean:
@@ -20,7 +20,7 @@ services:
       OVERLEAF_EMAIL_SMTP_HOST: 'mailtrap'
       OVERLEAF_EMAIL_SMTP_PORT: '25'
       OVERLEAF_EMAIL_SMTP_IGNORE_TLS: 'true'
-      ENABLED_LINKED_FILE_TYPES: 'project_file,project_output_file'
+      ENABLED_LINKED_FILE_TYPES: 'project_file,project_output_file,url'
       ENABLE_CONVERSIONS: 'true'
       EMAIL_CONFIRMATION_DISABLED: 'true'
     healthcheck:
@@ -35,7 +35,7 @@ services:
       MAILTRAP_PASSWORD: 'password-for-mailtrap'
 
   mongo:
-    image: mongo:6.0
+    image: mongo:8.0.11
     command: '--replSet overleaf'
     volumes:
       - ../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
|
@ -2,6 +2,7 @@ import {
|
|||
createNewFile,
|
||||
createProject,
|
||||
openProjectById,
|
||||
testNewFileUpload,
|
||||
} from './helpers/project'
|
||||
import { isExcludedBySharding, startWith } from './helpers/config'
|
||||
import { ensureUserExists, login } from './helpers/login'
|
||||
|
@@ -119,24 +120,7 @@ describe('editor', () => {
     cy.get('button').contains('New file').click({ force: true })
   })
 
-  it('can upload file', () => {
-    const name = `${uuid()}.txt`
-    const content = `Test File Content ${name}`
-    cy.get('button').contains('Upload').click({ force: true })
-    cy.get('input[type=file]')
-      .first()
-      .selectFile(
-        {
-          contents: Cypress.Buffer.from(content),
-          fileName: name,
-          lastModified: Date.now(),
-        },
-        { force: true }
-      )
-    // force: The file-tree pane is too narrow to display the full name.
-    cy.findByTestId('file-tree').findByText(name).click({ force: true })
-    cy.findByText(content)
-  })
+  testNewFileUpload()
 
   it('should not display import from URL', () => {
     cy.findByText('From external URL').should('not.exist')
server-ce/test/filestore-migration.spec.ts (new file, 104 lines)
@@ -0,0 +1,104 @@
import { ensureUserExists, login } from './helpers/login'
import {
  createProject,
  openProjectById,
  prepareFileUploadTest,
} from './helpers/project'
import { isExcludedBySharding, startWith } from './helpers/config'
import { prepareWaitForNextCompileSlot } from './helpers/compile'
import { beforeWithReRunOnTestRetry } from './helpers/beforeWithReRunOnTestRetry'
import { v4 as uuid } from 'uuid'
import { purgeFilestoreData, runScript } from './helpers/hostAdminClient'

describe('filestore migration', function () {
  if (isExcludedBySharding('CE_CUSTOM_3')) return
  startWith({ withDataDir: true, resetData: true, vars: {} })
  ensureUserExists({ email: 'user@example.com' })

  let projectName: string
  let projectId: string
  let waitForCompileRateLimitCoolOff: (fn: () => void) => void
  const previousBinaryFiles: (() => void)[] = []
  beforeWithReRunOnTestRetry(function () {
    projectName = `project-${uuid()}`
    login('user@example.com')
    createProject(projectName, { type: 'Example project' }).then(
      id => (projectId = id)
    )
    let queueReset
    ;({ waitForCompileRateLimitCoolOff, queueReset } =
      prepareWaitForNextCompileSlot())
    queueReset()
    previousBinaryFiles.push(prepareFileUploadTest(true))
  })

  beforeEach(() => {
    login('user@example.com')
    waitForCompileRateLimitCoolOff(() => {
      openProjectById(projectId)
    })
  })

  function checkFilesAreAccessible() {
    it('can upload new binary file and read previous uploads', function () {
      previousBinaryFiles.push(prepareFileUploadTest(true))
      for (const check of previousBinaryFiles) {
        check()
      }
    })

    it('renders frog jpg', () => {
      cy.findByTestId('file-tree').findByText('frog.jpg').click()
      cy.get('[alt="frog.jpg"]')
        .should('be.visible')
        .and('have.prop', 'naturalWidth')
        .should('be.greaterThan', 0)
    })
  }

  describe('OVERLEAF_FILESTORE_MIGRATION_LEVEL not set', function () {
    startWith({ withDataDir: true, vars: {} })
    checkFilesAreAccessible()
  })

  describe('OVERLEAF_FILESTORE_MIGRATION_LEVEL=0', function () {
    startWith({
      withDataDir: true,
      vars: { OVERLEAF_FILESTORE_MIGRATION_LEVEL: '0' },
    })
    checkFilesAreAccessible()

    describe('OVERLEAF_FILESTORE_MIGRATION_LEVEL=1', function () {
      startWith({
        withDataDir: true,
        vars: { OVERLEAF_FILESTORE_MIGRATION_LEVEL: '1' },
      })
      checkFilesAreAccessible()

      describe('OVERLEAF_FILESTORE_MIGRATION_LEVEL=2', function () {
        startWith({
          withDataDir: true,
          vars: { OVERLEAF_FILESTORE_MIGRATION_LEVEL: '1' },
        })
        before(async function () {
          await runScript({
            cwd: 'services/history-v1',
            script: 'storage/scripts/back_fill_file_hash.mjs',
          })
        })
        startWith({
          withDataDir: true,
          vars: { OVERLEAF_FILESTORE_MIGRATION_LEVEL: '2' },
        })
        checkFilesAreAccessible()

        describe('purge filestore data', function () {
          before(async function () {
            await purgeFilestoreData()
          })
          checkFilesAreAccessible()
        })
      })
    })
  })
})
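Taken together, the new spec walks the migration ladder: binary files uploaded at each stage must remain readable as `OVERLEAF_FILESTORE_MIGRATION_LEVEL` moves from unset to 0, then 1, then (after running `storage/scripts/back_fill_file_hash.mjs` from `services/history-v1`) 2, and finally after the legacy `user_files` data is purged via the new host-admin endpoint shown further below.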
@@ -9,6 +9,7 @@ export function isExcludedBySharding(
     | 'CE_DEFAULT'
     | 'CE_CUSTOM_1'
     | 'CE_CUSTOM_2'
+    | 'CE_CUSTOM_3'
     | 'PRO_DEFAULT_1'
     | 'PRO_DEFAULT_2'
     | 'PRO_CUSTOM_1'
@@ -85,6 +85,12 @@ export async function getRedisKeys() {
   return stdout.split('\n')
 }
 
+export async function purgeFilestoreData() {
+  await fetchJSON(`${hostAdminURL}/data/user_files`, {
+    method: 'DELETE',
+  })
+}
+
 async function sleep(ms: number) {
   return new Promise(resolve => {
     setTimeout(resolve, ms)
@@ -216,3 +216,43 @@ export function createNewFile() {
 
   return fileName
 }
+
+export function prepareFileUploadTest(binary = false) {
+  const name = `${uuid()}.txt`
+  const content = `Test File Content ${name}${binary ? ' \x00' : ''}`
+  cy.get('button').contains('Upload').click({ force: true })
+  cy.get('input[type=file]')
+    .first()
+    .selectFile(
+      {
+        contents: Cypress.Buffer.from(content),
+        fileName: name,
+        lastModified: Date.now(),
+      },
+      { force: true }
+    )
+
+  // wait for the upload to finish
+  cy.findByRole('treeitem', { name })
+
+  return function check() {
+    cy.findByRole('treeitem', { name }).click()
+    if (binary) {
+      cy.findByText(content).should('not.have.class', 'cm-line')
+    } else {
+      cy.findByText(content).should('have.class', 'cm-line')
+    }
+  }
+}
+
+export function testNewFileUpload() {
+  it('can upload text file', () => {
+    const check = prepareFileUploadTest(false)
+    check()
+  })
+
+  it('can upload binary file', () => {
+    const check = prepareFileUploadTest(true)
+    check()
+  })
+}
@@ -29,6 +29,17 @@ const IMAGES = {
   PRO: process.env.IMAGE_TAG_PRO.replace(/:.+/, ''),
 }
 
+function defaultDockerComposeOverride() {
+  return {
+    services: {
+      sharelatex: {
+        environment: {},
+      },
+      'git-bridge': {},
+    },
+  }
+}
+
 let previousConfig = ''
 
 function readDockerComposeOverride() {
@@ -38,14 +49,7 @@ function readDockerComposeOverride() {
     if (error.code !== 'ENOENT') {
       throw error
     }
-    return {
-      services: {
-        sharelatex: {
-          environment: {},
-        },
-        'git-bridge': {},
-      },
-    }
+    return defaultDockerComposeOverride()
   }
 }
@@ -77,12 +81,21 @@ app.use(bodyParser.json())
 app.use((req, res, next) => {
   // Basic access logs
   console.log(req.method, req.url, req.body)
+  const json = res.json
+  res.json = body => {
+    console.log(req.method, req.url, req.body, '->', body)
+    json.call(res, body)
+  }
   next()
 })
 app.use((req, res, next) => {
   // Add CORS headers
   const accessControlAllowOrigin =
     process.env.ACCESS_CONTROL_ALLOW_ORIGIN || 'http://sharelatex'
   res.setHeader('Access-Control-Allow-Origin', accessControlAllowOrigin)
   res.setHeader('Access-Control-Allow-Headers', 'Content-Type')
   res.setHeader('Access-Control-Max-Age', '3600')
   res.setHeader('Access-Control-Allow-Methods', 'DELETE, GET, HEAD, POST, PUT')
   next()
 })
@@ -133,6 +146,7 @@ const allowedVars = Joi.object(
       'V1_HISTORY_URL',
       'SANDBOXED_COMPILES',
       'ALL_TEX_LIVE_DOCKER_IMAGE_NAMES',
+      'OVERLEAF_FILESTORE_MIGRATION_LEVEL',
       'OVERLEAF_TEMPLATES_USER_ID',
       'OVERLEAF_NEW_PROJECT_TEMPLATE_LINKS',
       'OVERLEAF_ALLOW_PUBLIC_ACCESS',
@@ -319,8 +333,19 @@ app.get('/redis/keys', (req, res) => {
   )
 })
 
+app.delete('/data/user_files', (req, res) => {
+  runDockerCompose(
+    'exec',
+    ['sharelatex', 'rm', '-rf', '/var/lib/overleaf/data/user_files'],
+    (error, stdout, stderr) => {
+      res.json({ error, stdout, stderr })
+    }
+  )
+})
+
 app.use(handleValidationErrors())
 
 purgeDataDir()
+writeDockerComposeOverride(defaultDockerComposeOverride())
 
 app.listen(80)
@@ -42,7 +42,7 @@ services:
     command: tar -czf /tmp/build/build.tar.gz --exclude=build.tar.gz --exclude-vcs .
     user: root
   mongo:
-    image: mongo:7.0.20
+    image: mongo:8.0.11
     command: --replSet overleaf
     volumes:
       - ../../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
@@ -44,7 +44,7 @@ services:
     command: npm run --silent test:acceptance
 
   mongo:
-    image: mongo:7.0.20
+    image: mongo:8.0.11
     command: --replSet overleaf
     volumes:
       - ../../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
@@ -129,7 +129,7 @@ function compile(req, res, next) {
       compiler: request.compiler,
       draft: request.draft,
       imageName: request.imageName
-        ? Path.basename(request.imageName)
+        ? request.imageName
         : undefined,
       rootResourcePath: request.rootResourcePath,
       stopOnFirstError: request.stopOnFirstError,
@@ -232,8 +232,8 @@ const DockerRunner = {
       }
     }
     // set the path based on the image year
-    const match = image.match(/:([0-9]+)\.[0-9]+/)
-    const year = match ? match[1] : '2014'
+    const match = image.match(/:([0-9]+)\.[0-9]+|:TL([0-9]+)/)
+    const year = match ? match[1] || match[2] : '2014'
     env.PATH = `/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/texlive/${year}/bin/x86_64-linux/`
     const options = {
       Cmd: command,
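The widened regular expression keeps the old `:YYYY.N` tag handling and additionally recognizes `:TLYYYY`-style tags, so a custom TeX Live image name can still select the right `/usr/local/texlive/<year>` PATH entry. A small sketch of the behavior (the image names are made-up examples, not from the diff):

const pattern = /:([0-9]+)\.[0-9]+|:TL([0-9]+)/

for (const image of [
  'texlive/texlive-full:2023.1', // standard tag -> year from group 1
  'registry.example.com/texlive:TL2025', // TL-style tag -> year from group 2
  'texlive:latest', // no match -> default '2014'
]) {
  const match = image.match(pattern)
  const year = match ? match[1] || match[2] : '2014'
  console.log(image, '->', year)
}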
@ -107,7 +107,7 @@ if ((process.env.DOCKER_RUNNER || process.env.SANDBOXED_COMPILES) === 'true') {
|
|||
CLSI: 1,
|
||||
},
|
||||
socketPath: '/var/run/docker.sock',
|
||||
user: process.env.TEXLIVE_IMAGE_USER || 'tex',
|
||||
user: process.env.TEXLIVE_IMAGE_USER || 'www-data',
|
||||
},
|
||||
optimiseInDocker: true,
|
||||
expireProjectAfterIdleMs: 24 * 60 * 60 * 1000,
|
||||
|
|
|
@@ -829,13 +829,19 @@
       "args": []
     },
     {
-      "name": "gettimeofday",
-      "action": "SCMP_ACT_ALLOW",
-      "args": []
-    }, {
-      "name": "epoll_pwait",
-      "action": "SCMP_ACT_ALLOW",
-      "args": []
+      "name": "gettimeofday",
+      "action": "SCMP_ACT_ALLOW",
+      "args": []
+    },
+    {
+      "name": "epoll_pwait",
+      "action": "SCMP_ACT_ALLOW",
+      "args": []
+    },
+    {
+      "name": "poll",
+      "action": "SCMP_ACT_ALLOW",
+      "args": []
     }
   ]
 }
@@ -42,7 +42,7 @@ services:
     command: tar -czf /tmp/build/build.tar.gz --exclude=build.tar.gz --exclude-vcs .
     user: root
   mongo:
-    image: mongo:7.0.20
+    image: mongo:8.0.11
     command: --replSet overleaf
     volumes:
       - ../../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
@@ -44,7 +44,7 @@ services:
     command: npm run --silent test:acceptance
 
   mongo:
-    image: mongo:7.0.20
+    image: mongo:8.0.11
     command: --replSet overleaf
     volumes:
       - ../../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
@@ -47,7 +47,7 @@ services:
     command: tar -czf /tmp/build/build.tar.gz --exclude=build.tar.gz --exclude-vcs .
     user: root
   mongo:
-    image: mongo:7.0.20
+    image: mongo:8.0.11
     command: --replSet overleaf
     volumes:
       - ../../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
@@ -49,7 +49,7 @@ services:
     command: npm run --silent test:acceptance
 
   mongo:
-    image: mongo:7.0.20
+    image: mongo:8.0.11
     command: --replSet overleaf
     volumes:
       - ../../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
@@ -55,7 +55,7 @@ services:
       retries: 20
 
   mongo:
-    image: mongo:7.0.20
+    image: mongo:8.0.11
     command: --replSet overleaf
     volumes:
       - ../../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
@@ -57,7 +57,7 @@ services:
       retries: 20
 
   mongo:
-    image: mongo:7.0.20
+    image: mongo:8.0.11
     command: --replSet overleaf
     volumes:
       - ../../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
@@ -111,6 +111,11 @@ if (settings.filestore.stores.template_files) {
     keyBuilder.templateFileKeyMiddleware,
     fileController.insertFile
   )
+  app.delete(
+    '/template/:template_id/v/:version/:format',
+    keyBuilder.templateFileKeyMiddleware,
+    fileController.deleteFile
+  )
 }
 
 app.get(
@@ -5,7 +5,7 @@ const { callbackify } = require('node:util')
 const safeExec = require('./SafeExec').promises
 const { ConversionError } = require('./Errors')
 
-const APPROVED_FORMATS = ['png']
+const APPROVED_FORMATS = ['png', 'jpg']
 const FOURTY_SECONDS = 40 * 1000
 const KILL_SIGNAL = 'SIGTERM'
@@ -34,16 +34,14 @@ async function convert(sourcePath, requestedFormat) {
 }
 
 async function thumbnail(sourcePath) {
-  const width = '260x'
-  return await convert(sourcePath, 'png', [
+  const width = '548x'
+  return await _convert(sourcePath, 'jpg', [
     'convert',
     '-flatten',
     '-background',
     'white',
     '-density',
     '300',
     '-define',
     `pdf:fit-page=${width}`,
     `${sourcePath}[0]`,
     '-resize',
     width,
@@ -51,16 +49,14 @@ async function thumbnail(sourcePath) {
 }
 
 async function preview(sourcePath) {
-  const width = '548x'
-  return await convert(sourcePath, 'png', [
+  const width = '794x'
+  return await _convert(sourcePath, 'jpg', [
     'convert',
     '-flatten',
     '-background',
     'white',
     '-density',
     '300',
     '-define',
     `pdf:fit-page=${width}`,
     `${sourcePath}[0]`,
     '-resize',
     width,
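For reference, the full ImageMagick invocation that `thumbnail()` now builds renders page one of the PDF at 300 dpi onto a flattened white background and resizes it to 548 px wide as a JPEG (`preview()` is the same at 794 px). A sketch of the assembled argument list; the file paths are hypothetical, and the output path is assumed to be appended downstream by `_convert`:

const sourcePath = '/tmp/example.pdf' // hypothetical input
const width = '548x'
const args = [
  'convert',
  '-flatten',
  '-background', 'white',
  '-density', '300',
  '-define', `pdf:fit-page=${width}`,
  `${sourcePath}[0]`, // first page only
  '-resize', width,
  '/tmp/example-thumbnail.jpg', // hypothetical output path
]
console.log(args.join(' '))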
@@ -150,7 +150,9 @@ async function _getConvertedFileAndCache(bucket, key, convertedKey, opts) {
   let convertedFsPath
   try {
     convertedFsPath = await _convertFile(bucket, key, opts)
-    await ImageOptimiser.promises.compressPng(convertedFsPath)
+    if (convertedFsPath.toLowerCase().endsWith(".png")) {
+      await ImageOptimiser.promises.compressPng(convertedFsPath)
+    }
     await PersistorManager.sendFile(bucket, convertedKey, convertedFsPath)
   } catch (err) {
     LocalFileWriter.deleteFile(convertedFsPath, () => {})
@@ -75,7 +75,7 @@ services:
       retries: 20
 
   mongo:
-    image: mongo:7.0.20
+    image: mongo:8.0.11
     command: --replSet overleaf
     volumes:
       - ../../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
@@ -83,7 +83,7 @@ services:
       retries: 20
 
   mongo:
-    image: mongo:7.0.20
+    image: mongo:8.0.11
     command: --replSet overleaf
     volumes:
       - ../../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
@@ -150,10 +150,6 @@ const CONCURRENT_BATCHES = parseInt(process.env.CONCURRENT_BATCHES || '2', 10)
 const RETRIES = parseInt(process.env.RETRIES || '10', 10)
 const RETRY_DELAY_MS = parseInt(process.env.RETRY_DELAY_MS || '100', 10)
 
-const USER_FILES_BUCKET_NAME = process.env.USER_FILES_BUCKET_NAME || ''
-if (!USER_FILES_BUCKET_NAME) {
-  throw new Error('env var USER_FILES_BUCKET_NAME is missing')
-}
 const RETRY_FILESTORE_404 = process.env.RETRY_FILESTORE_404 === 'true'
 const BUFFER_DIR = fs.mkdtempSync(
   process.env.BUFFER_DIR_PREFIX || '/tmp/back_fill_file_hash-'
@@ -9,15 +9,12 @@ import { Blob } from 'overleaf-editor-core'
 import {
   BlobStore,
   getStringLengthOfFile,
-  GLOBAL_BLOBS,
   makeBlobForFile,
 } from '../lib/blob_store/index.js'
 import { db } from '../lib/mongodb.js'
 import commandLineArgs from 'command-line-args'
 import readline from 'node:readline'
-import { _blobIsBackedUp, backupBlob } from '../lib/backupBlob.mjs'
 import { NotFoundError } from '@overleaf/object-persistor/src/Errors.js'
-import filestorePersistor from '../lib/persistor.js'
 import { setTimeout } from 'node:timers/promises'
 
 // Silence warning.
@@ -52,12 +49,11 @@ ObjectId.cacheHexString = true
  */
 
 /**
- * @return {{FIX_NOT_FOUND: boolean, FIX_HASH_MISMATCH: boolean, FIX_DELETE_PERMISSION: boolean, FIX_MISSING_HASH: boolean, LOGS: string}}
+ * @return {{FIX_NOT_FOUND: boolean, FIX_HASH_MISMATCH: boolean, FIX_MISSING_HASH: boolean, LOGS: string}}
  */
 function parseArgs() {
   const args = commandLineArgs([
     { name: 'fixNotFound', type: String, defaultValue: 'true' },
-    { name: 'fixDeletePermission', type: String, defaultValue: 'true' },
     { name: 'fixHashMismatch', type: String, defaultValue: 'true' },
     { name: 'fixMissingHash', type: String, defaultValue: 'true' },
     { name: 'logs', type: String, defaultValue: '' },
@@ -74,20 +70,13 @@ function parseArgs() {
   }
   return {
     FIX_HASH_MISMATCH: boolVal('fixNotFound'),
-    FIX_DELETE_PERMISSION: boolVal('fixDeletePermission'),
     FIX_NOT_FOUND: boolVal('fixHashMismatch'),
     FIX_MISSING_HASH: boolVal('fixMissingHash'),
     LOGS: args.logs,
   }
 }
 
-const {
-  FIX_HASH_MISMATCH,
-  FIX_DELETE_PERMISSION,
-  FIX_NOT_FOUND,
-  FIX_MISSING_HASH,
-  LOGS,
-} = parseArgs()
+const { FIX_HASH_MISMATCH, FIX_NOT_FOUND, FIX_MISSING_HASH, LOGS } = parseArgs()
 if (!LOGS) {
   throw new Error('--logs parameter missing')
 }
@@ -105,6 +94,37 @@ const STREAM_HIGH_WATER_MARK = parseInt(
 )
 const SLEEP_BEFORE_EXIT = parseInt(process.env.SLEEP_BEFORE_EXIT || '1000', 10)
 
+// Filestore endpoint location
+const FILESTORE_HOST = process.env.FILESTORE_HOST || '127.0.0.1'
+const FILESTORE_PORT = process.env.FILESTORE_PORT || '3009'
+
+async function fetchFromFilestore(projectId, fileId) {
+  const url = `http://${FILESTORE_HOST}:${FILESTORE_PORT}/project/${projectId}/file/${fileId}`
+  const response = await fetch(url)
+  if (!response.ok) {
+    if (response.status === 404) {
+      throw new NotFoundError('file not found in filestore', {
+        status: response.status,
+      })
+    }
+    const body = await response.text()
+    throw new OError('fetchFromFilestore failed', {
+      projectId,
+      fileId,
+      status: response.status,
+      body,
+    })
+  }
+  if (!response.body) {
+    throw new OError('fetchFromFilestore response has no body', {
+      projectId,
+      fileId,
+      status: response.status,
+    })
+  }
+  return response.body
+}
+
 /** @type {ProjectsCollection} */
 const projectsCollection = db.collection('projects')
 /** @type {DeletedProjectsCollection} */
@@ -302,19 +322,16 @@ async function setHashInMongo(projectId, fileId, hash) {
  * @return {Promise<void>}
  */
 async function importRestoredFilestoreFile(projectId, fileId, historyId) {
-  const filestoreKey = `${projectId}/${fileId}`
   const path = `${BUFFER_DIR}/${projectId}_${fileId}`
   try {
     let s
     try {
-      s = await filestorePersistor.getObjectStream(
-        USER_FILES_BUCKET_NAME,
-        filestoreKey
-      )
+      s = await fetchFromFilestore(projectId, fileId)
     } catch (err) {
       if (err instanceof NotFoundError) {
         throw new OError('missing blob, need to restore filestore file', {
-          filestoreKey,
           projectId,
           fileId,
         })
       }
       throw err
@@ -325,7 +342,6 @@ async function importRestoredFilestoreFile(projectId, fileId, historyId) {
     )
     const blobStore = new BlobStore(historyId)
     const blob = await blobStore.putFile(path)
-    await backupBlob(historyId, blob, path)
     await setHashInMongo(projectId, fileId, blob.getHash())
   } finally {
     await fs.promises.rm(path, { force: true })
@@ -339,13 +355,9 @@ async function importRestoredFilestoreFile(projectId, fileId, historyId) {
  * @return {Promise<Blob>}
  */
 async function bufferFilestoreFileToDisk(projectId, fileId, path) {
-  const filestoreKey = `${projectId}/${fileId}`
   try {
     await Stream.promises.pipeline(
-      await filestorePersistor.getObjectStream(
-        USER_FILES_BUCKET_NAME,
-        filestoreKey
-      ),
+      await fetchFromFilestore(projectId, fileId),
       fs.createWriteStream(path, { highWaterMark: STREAM_HIGH_WATER_MARK })
     )
     const blob = await makeBlobForFile(path)
@@ -356,7 +368,8 @@ async function bufferFilestoreFileToDisk(projectId, fileId, path) {
   } catch (err) {
     if (err instanceof NotFoundError) {
       throw new OError('missing blob, need to restore filestore file', {
-        filestoreKey,
+        projectId,
+        fileId,
       })
     }
     throw err
@@ -389,7 +402,7 @@ async function uploadFilestoreFile(projectId, fileId) {
     const blob = await bufferFilestoreFileToDisk(projectId, fileId, path)
     const hash = blob.getHash()
     try {
-      await ensureBlobExistsForFileAndUploadToAWS(projectId, fileId, hash)
+      await ensureBlobExistsForFile(projectId, fileId, hash)
     } catch (err) {
       if (!(err instanceof Blob.NotFoundError)) throw err
 
@@ -397,7 +410,7 @@ async function uploadFilestoreFile(projectId, fileId) {
       const historyId = project.overleaf.history.id.toString()
       const blobStore = new BlobStore(historyId)
       await blobStore.putBlob(path, blob)
-      await ensureBlobExistsForFileAndUploadToAWS(projectId, fileId, hash)
+      await ensureBlobExistsForFile(projectId, fileId, hash)
     }
   } finally {
     await fs.promises.rm(path, { force: true })
@@ -426,11 +439,7 @@ async function fixHashMismatch(line) {
     await importRestoredFilestoreFile(projectId, fileId, historyId)
     return true
   }
-  return await ensureBlobExistsForFileAndUploadToAWS(
-    projectId,
-    fileId,
-    computedHash
-  )
+  return await ensureBlobExistsForFile(projectId, fileId, computedHash)
 }
 
 /**
@@ -444,30 +453,19 @@ async function hashAlreadyUpdatedInFileTree(projectId, fileId, hash) {
   return fileRef.hash === hash
 }
 
-/**
- * @param {string} projectId
- * @param {string} hash
- * @return {Promise<boolean>}
- */
-async function needsBackingUpToAWS(projectId, hash) {
-  if (GLOBAL_BLOBS.has(hash)) return false
-  return !(await _blobIsBackedUp(projectId, hash))
-}
-
 /**
  * @param {string} projectId
  * @param {string} fileId
  * @param {string} hash
  * @return {Promise<boolean>}
  */
-async function ensureBlobExistsForFileAndUploadToAWS(projectId, fileId, hash) {
+async function ensureBlobExistsForFile(projectId, fileId, hash) {
   const { project } = await getProject(projectId)
   const historyId = project.overleaf.history.id.toString()
   const blobStore = new BlobStore(historyId)
   if (
     (await hashAlreadyUpdatedInFileTree(projectId, fileId, hash)) &&
-    (await blobStore.getBlob(hash)) &&
-    !(await needsBackingUpToAWS(projectId, hash))
+    (await blobStore.getBlob(hash))
   ) {
     return false // already processed
   }
@@ -488,7 +486,7 @@ async function ensureBlobExistsForFileAndUploadToAWS(projectId, fileId, hash) {
       )
       if (writtenBlob.getHash() !== hash) {
         // Double check download, better safe than sorry.
-        throw new OError('blob corrupted', { writtenBlob })
+        throw new OError('blob corrupted', { writtenBlob, hash })
       }
 
       let blob = await blobStore.getBlob(hash)
@@ -497,7 +495,6 @@ async function ensureBlobExistsForFileAndUploadToAWS(projectId, fileId, hash) {
         // HACK: Skip upload to GCS and finalize putBlob operation directly.
         await blobStore.backend.insertBlob(historyId, writtenBlob)
       }
-      await backupBlob(historyId, writtenBlob, path)
     } finally {
       await fs.promises.rm(path, { force: true })
     }
@@ -505,16 +502,6 @@ async function ensureBlobExistsForFileAndUploadToAWS(projectId, fileId, hash) {
   return true
 }
 
-/**
- * @param {string} line
- * @return {Promise<boolean>}
- */
-async function fixDeletePermission(line) {
-  let { projectId, fileId, hash } = JSON.parse(line)
-  if (!hash) hash = await computeFilestoreFileHash(projectId, fileId)
-  return await ensureBlobExistsForFileAndUploadToAWS(projectId, fileId, hash)
-}
-
 /**
  * @param {string} line
  * @return {Promise<boolean>}
@@ -526,7 +513,7 @@ async function fixMissingHash(line) {
   } = await findFile(projectId, fileId)
   if (hash) {
     // processed, double check
-    return await ensureBlobExistsForFileAndUploadToAWS(projectId, fileId, hash)
+    return await ensureBlobExistsForFile(projectId, fileId, hash)
   }
   await uploadFilestoreFile(projectId, fileId)
   return true
@@ -543,11 +530,6 @@ const CASES = {
     flag: FIX_HASH_MISMATCH,
     action: fixHashMismatch,
   },
-  'delete permission': {
-    match: 'storage.objects.delete',
-    flag: FIX_DELETE_PERMISSION,
-    action: fixDeletePermission,
-  },
   'missing file hash': {
     match: '"bad file hash"',
     flag: FIX_MISSING_HASH,
|
@ -20,7 +20,7 @@ import {
|
|||
makeProjectKey,
|
||||
} from '../../../../storage/lib/blob_store/index.js'
|
||||
|
||||
import express from 'express'
|
||||
import { mockFilestore } from './support/MockFilestore.mjs'
|
||||
|
||||
chai.use(chaiExclude)
|
||||
const TIMEOUT = 20 * 1_000
|
||||
|
@@ -28,59 +28,6 @@ const TIMEOUT = 20 * 1_000
 const projectsCollection = db.collection('projects')
 const deletedProjectsCollection = db.collection('deletedProjects')
 
-class MockFilestore {
-  constructor() {
-    this.host = process.env.FILESTORE_HOST || '127.0.0.1'
-    this.port = process.env.FILESTORE_PORT || 3009
-    // create a server listening on this.host and this.port
-    this.files = {}
-
-    this.app = express()
-
-    this.app.get('/project/:projectId/file/:fileId', (req, res) => {
-      const { projectId, fileId } = req.params
-      const content = this.files[projectId]?.[fileId]
-      if (!content) return res.status(404).end()
-      res.status(200).end(content)
-    })
-  }
-
-  start() {
-    // reset stored files
-    this.files = {}
-    // start the server
-    if (this.serverPromise) {
-      return this.serverPromise
-    } else {
-      this.serverPromise = new Promise((resolve, reject) => {
-        this.server = this.app.listen(this.port, this.host, err => {
-          if (err) return reject(err)
-          resolve()
-        })
-      })
-      return this.serverPromise
-    }
-  }
-
-  addFile(projectId, fileId, fileContent) {
-    if (!this.files[projectId]) {
-      this.files[projectId] = {}
-    }
-    this.files[projectId][fileId] = fileContent
-  }
-
-  deleteObject(projectId, fileId) {
-    if (this.files[projectId]) {
-      delete this.files[projectId][fileId]
-      if (Object.keys(this.files[projectId]).length === 0) {
-        delete this.files[projectId]
-      }
-    }
-  }
-}
-
-const mockFilestore = new MockFilestore()
-
 /**
  * @param {ObjectId} objectId
  * @return {string}
@@ -1,48 +1,24 @@
 import fs from 'node:fs'
 import Crypto from 'node:crypto'
-import Stream from 'node:stream'
 import { promisify } from 'node:util'
 import { Binary, ObjectId } from 'mongodb'
 import { Blob } from 'overleaf-editor-core'
-import { backedUpBlobs, blobs, db } from '../../../../storage/lib/mongodb.js'
+import { db } from '../../../../storage/lib/mongodb.js'
 import cleanup from './support/cleanup.js'
 import testProjects from '../api/support/test_projects.js'
 import { execFile } from 'node:child_process'
 import chai, { expect } from 'chai'
 import chaiExclude from 'chai-exclude'
-import config from 'config'
 import { WritableBuffer } from '@overleaf/stream-utils'
-import {
-  backupPersistor,
-  projectBlobsBucket,
-} from '../../../../storage/lib/backupPersistor.mjs'
-import projectKey from '../../../../storage/lib/project_key.js'
-import {
-  BlobStore,
-  makeProjectKey,
-} from '../../../../storage/lib/blob_store/index.js'
-import ObjectPersistor from '@overleaf/object-persistor'
+import { BlobStore } from '../../../../storage/lib/blob_store/index.js'
+import { mockFilestore } from './support/MockFilestore.mjs'
 
 chai.use(chaiExclude)
 
 const TIMEOUT = 20 * 1_000
 
-const { deksBucket } = config.get('backupStore')
-const { tieringStorageClass } = config.get('backupPersistor')
-
 const projectsCollection = db.collection('projects')
 const deletedProjectsCollection = db.collection('deletedProjects')
 
-const FILESTORE_PERSISTOR = ObjectPersistor({
-  backend: 'gcs',
-  gcs: {
-    endpoint: {
-      apiEndpoint: process.env.GCS_API_ENDPOINT,
-      projectId: process.env.GCS_PROJECT_ID,
-    },
-  },
-})
-
 /**
  * @param {ObjectId} objectId
  * @return {string}
@@ -70,17 +46,6 @@ function binaryForGitBlobHash(gitBlobHash) {
   return new Binary(Buffer.from(gitBlobHash, 'hex'))
 }
 
-async function listS3Bucket(bucket, wantStorageClass) {
-  const client = backupPersistor._getClientForBucket(bucket)
-  const response = await client.listObjectsV2({ Bucket: bucket }).promise()
-
-  for (const object of response.Contents || []) {
-    expect(object).to.have.property('StorageClass', wantStorageClass)
-  }
-
-  return (response.Contents || []).map(item => item.Key || '')
-}
-
 function objectIdFromTime(timestamp) {
   return ObjectId.createFromTime(new Date(timestamp).getTime() / 1000)
 }
@@ -97,7 +62,6 @@ describe('back_fill_file_hash_fix_up script', function () {
   const historyIdDeleted0 = projectIdDeleted0.toString()
   const fileIdWithDifferentHashFound = objectIdFromTime('2017-02-01T00:00:00Z')
   const fileIdInGoodState = objectIdFromTime('2017-02-01T00:01:00Z')
-  const fileIdBlobExistsInGCS0 = objectIdFromTime('2017-02-01T00:02:00Z')
   const fileIdWithDifferentHashNotFound0 = objectIdFromTime(
     '2017-02-01T00:03:00Z'
   )
@@ -112,9 +76,6 @@ describe('back_fill_file_hash_fix_up script', function () {
   const fileIdWithDifferentHashRestore = objectIdFromTime(
     '2017-02-01T00:08:00Z'
   )
-  const fileIdBlobExistsInGCS1 = objectIdFromTime('2017-02-01T00:09:00Z')
-  const fileIdRestoreFromFilestore0 = objectIdFromTime('2017-02-01T00:10:00Z')
-  const fileIdRestoreFromFilestore1 = objectIdFromTime('2017-02-01T00:11:00Z')
   const fileIdMissing2 = objectIdFromTime('2017-02-01T00:12:00Z')
   const fileIdHashMissing0 = objectIdFromTime('2017-02-01T00:13:00Z')
   const fileIdHashMissing1 = objectIdFromTime('2017-02-01T00:14:00Z')
@@ -125,31 +86,11 @@ describe('back_fill_file_hash_fix_up script', function () {
   )
   const deleteProjectsRecordId0 = new ObjectId()
   const writtenBlobs = [
-    {
-      projectId: projectId0,
-      historyId: historyId0,
-      fileId: fileIdBlobExistsInGCS0,
-    },
-    {
-      projectId: projectId0,
-      historyId: historyId0,
-      fileId: fileIdBlobExistsInGCS1,
-    },
     {
       projectId: projectId0,
       historyId: historyId0,
       fileId: fileIdWithDifferentHashNotFound0,
     },
-    {
-      projectId: projectId0,
-      historyId: historyId0,
-      fileId: fileIdRestoreFromFilestore0,
-    },
-    {
-      projectId: projectId0,
-      historyId: historyId0,
-      fileId: fileIdRestoreFromFilestore1,
-    },
     {
       projectId: projectId0,
       historyId: historyId0,
@@ -200,17 +141,6 @@ describe('back_fill_file_hash_fix_up script', function () {
       },
       msg: 'failed to process file',
     },
-    {
-      projectId: projectId0,
-      fileId: fileIdRestoreFromFilestore0,
-      err: { message: 'OError: hash mismatch' },
-      hash: gitBlobHash(fileIdRestoreFromFilestore0),
-      entry: {
-        ctx: { historyId: historyId0.toString() },
-        hash: hashDoesNotExistAsBlob,
-      },
-      msg: 'failed to process file',
-    },
     {
       projectId: projectIdDeleted0,
       fileId: fileIdWithDifferentHashNotFound1,
@@ -236,33 +166,6 @@ describe('back_fill_file_hash_fix_up script', function () {
       err: { message: 'NotFoundError' },
       msg: 'failed to process file',
     },
-    {
-      projectId: projectId0,
-      fileId: fileIdBlobExistsInGCS0,
-      hash: gitBlobHash(fileIdBlobExistsInGCS0),
-      err: { message: 'storage.objects.delete' },
-      msg: 'failed to process file',
-    },
-    {
-      projectId: projectId0,
-      fileId: fileIdBlobExistsInGCSCorrupted,
-      hash: gitBlobHash(fileIdBlobExistsInGCSCorrupted),
-      err: { message: 'storage.objects.delete' },
-      msg: 'failed to process file',
-    },
-    {
-      projectId: projectId0,
-      fileId: fileIdBlobExistsInGCS1,
-      hash: gitBlobHash(fileIdBlobExistsInGCS1),
-      err: { message: 'storage.objects.delete' },
-      msg: 'failed to process file',
-    },
-    {
-      projectId: projectId0,
-      fileId: fileIdRestoreFromFilestore1,
-      err: { message: 'storage.objects.delete' },
-      msg: 'failed to process file',
-    },
     {
       projectId: projectIdDeleted0,
       fileId: fileIdMissing1,
@@ -291,22 +194,23 @@ describe('back_fill_file_hash_fix_up script', function () {
       reason: 'bad file hash',
       msg: 'bad file-tree path',
     },
+    {
+      projectId: projectId0,
+      _id: fileIdBlobExistsInGCSCorrupted,
+      reason: 'bad file hash',
+      msg: 'bad file-tree path',
+    },
   ]
   if (PRINT_IDS_AND_HASHES_FOR_DEBUGGING) {
     const fileIds = {
       fileIdWithDifferentHashFound,
       fileIdInGoodState,
-      fileIdBlobExistsInGCS0,
-      fileIdBlobExistsInGCS1,
       fileIdWithDifferentHashNotFound0,
       fileIdWithDifferentHashNotFound1,
       fileIdBlobExistsInGCSCorrupted,
       fileIdMissing0,
       fileIdMissing1,
       fileIdMissing2,
       fileIdWithDifferentHashRestore,
-      fileIdRestoreFromFilestore0,
-      fileIdRestoreFromFilestore1,
       fileIdHashMissing0,
      fileIdHashMissing1,
    }
@@ -330,38 +234,25 @@ describe('back_fill_file_hash_fix_up script', function () {
  before(cleanup.everything)

  before('populate blobs/GCS', async function () {
    await FILESTORE_PERSISTOR.sendStream(
      USER_FILES_BUCKET_NAME,
      `${projectId0}/${fileIdRestoreFromFilestore0}`,
      Stream.Readable.from([fileIdRestoreFromFilestore0.toString()])
    await mockFilestore.start()
    mockFilestore.addFile(
      projectId0,
      fileIdHashMissing0,
      fileIdHashMissing0.toString()
    )
    await FILESTORE_PERSISTOR.sendStream(
      USER_FILES_BUCKET_NAME,
      `${projectId0}/${fileIdRestoreFromFilestore1}`,
      Stream.Readable.from([fileIdRestoreFromFilestore1.toString()])
    mockFilestore.addFile(
      projectId0,
      fileIdHashMissing1,
      fileIdHashMissing1.toString()
    )
    await FILESTORE_PERSISTOR.sendStream(
      USER_FILES_BUCKET_NAME,
      `${projectId0}/${fileIdHashMissing0}`,
      Stream.Readable.from([fileIdHashMissing0.toString()])
    )
    await FILESTORE_PERSISTOR.sendStream(
      USER_FILES_BUCKET_NAME,
      `${projectId0}/${fileIdHashMissing1}`,
      Stream.Readable.from([fileIdHashMissing1.toString()])
    mockFilestore.addFile(
      projectId0,
      fileIdBlobExistsInGCSCorrupted,
      fileIdBlobExistsInGCSCorrupted.toString()
    )
    await new BlobStore(historyId0.toString()).putString(
      fileIdHashMissing1.toString() // partially processed
    )
    await new BlobStore(historyId0.toString()).putString(
      fileIdBlobExistsInGCS0.toString()
    )
    await new BlobStore(historyId0.toString()).putString(
      fileIdBlobExistsInGCS1.toString()
    )
    await new BlobStore(historyId0.toString()).putString(
      fileIdRestoreFromFilestore1.toString()
    )
    const path = '/tmp/test-blob-corrupted'
    try {
      await fs.promises.writeFile(path, contentCorruptedBlob)
@@ -426,22 +317,10 @@ describe('back_fill_file_hash_fix_up script', function () {
      _id: fileIdWithDifferentHashNotFound0,
      hash: hashDoesNotExistAsBlob,
    },
    {
      _id: fileIdRestoreFromFilestore0,
      hash: hashDoesNotExistAsBlob,
    },
    {
      _id: fileIdRestoreFromFilestore1,
    },
    {
      _id: fileIdBlobExistsInGCS0,
      hash: gitBlobHash(fileIdBlobExistsInGCS0),
    },
    {
      _id: fileIdBlobExistsInGCSCorrupted,
      hash: gitBlobHash(fileIdBlobExistsInGCSCorrupted),
    },
    { _id: fileIdBlobExistsInGCS1 },
  ],
  folders: [],
},
@@ -546,8 +425,8 @@ describe('back_fill_file_hash_fix_up script', function () {
  })
  it('should print stats', function () {
    expect(stats).to.contain({
-      processedLines: 16,
-      success: 11,
+      processedLines: 12,
+      success: 7,
      alreadyProcessed: 0,
      fileDeleted: 0,
      skipped: 0,
@@ -558,9 +437,9 @@ describe('back_fill_file_hash_fix_up script', function () {
  it('should handle re-run on same logs', async function () {
    ;({ stats } = await runScriptWithLogs())
    expect(stats).to.contain({
-      processedLines: 16,
+      processedLines: 12,
      success: 0,
-      alreadyProcessed: 8,
+      alreadyProcessed: 4,
      fileDeleted: 3,
      skipped: 0,
      failed: 3,
@@ -663,31 +542,11 @@ describe('back_fill_file_hash_fix_up script', function () {
      _id: fileIdWithDifferentHashNotFound0,
      hash: gitBlobHash(fileIdWithDifferentHashNotFound0),
    },
    // Updated hash
    {
      _id: fileIdRestoreFromFilestore0,
      hash: gitBlobHash(fileIdRestoreFromFilestore0),
    },
    // Added hash
    {
      _id: fileIdRestoreFromFilestore1,
      hash: gitBlobHash(fileIdRestoreFromFilestore1),
    },
    // No change, blob created
    {
      _id: fileIdBlobExistsInGCS0,
      hash: gitBlobHash(fileIdBlobExistsInGCS0),
    },
    // No change, flagged
    {
      _id: fileIdBlobExistsInGCSCorrupted,
      hash: gitBlobHash(fileIdBlobExistsInGCSCorrupted),
    },
    // Added hash
    {
      _id: fileIdBlobExistsInGCS1,
      hash: gitBlobHash(fileIdBlobExistsInGCS1),
    },
  ],
  folders: [],
},
@@ -696,7 +555,7 @@ describe('back_fill_file_hash_fix_up script', function () {
      ],
      overleaf: { history: { id: historyId0 } },
      // Incremented when removing file/updating hash
-      version: 8,
+      version: 5,
    },
  ])
  expect(await deletedProjectsCollection.find({}).toArray()).to.deep.equal([
@@ -745,62 +604,6 @@ describe('back_fill_file_hash_fix_up script', function () {
        (writtenBlobsByProject.get(projectId) || []).concat([fileId])
      )
    }
    expect(
      (await backedUpBlobs.find({}, { sort: { _id: 1 } }).toArray()).map(
        entry => {
          // blobs are pushed unordered into mongo. Sort the list for consistency.
          entry.blobs.sort()
          return entry
        }
      )
    ).to.deep.equal(
      Array.from(writtenBlobsByProject.entries()).map(
        ([projectId, fileIds]) => {
          return {
            _id: projectId,
            blobs: fileIds
              .map(fileId => binaryForGitBlobHash(gitBlobHash(fileId)))
              .sort(),
          }
        }
      )
    )
  })
  it('should have backed up all the files', async function () {
    expect(tieringStorageClass).to.exist
    const objects = await listS3Bucket(projectBlobsBucket, tieringStorageClass)
    expect(objects.sort()).to.deep.equal(
      writtenBlobs
        .map(({ historyId, fileId, hash }) =>
          makeProjectKey(historyId, hash || gitBlobHash(fileId))
        )
        .sort()
    )
    for (let { historyId, fileId } of writtenBlobs) {
      const hash = gitBlobHash(fileId.toString())
      const s = await backupPersistor.getObjectStream(
        projectBlobsBucket,
        makeProjectKey(historyId, hash),
        { autoGunzip: true }
      )
      const buf = new WritableBuffer()
      await Stream.promises.pipeline(s, buf)
      expect(gitBlobHashBuffer(buf.getContents())).to.equal(hash)
      const id = buf.getContents().toString('utf-8')
      expect(id).to.equal(fileId.toString())
      // double check we are not comparing 'undefined' or '[object Object]' above
      expect(id).to.match(/^[a-f0-9]{24}$/)
    }
    const deks = await listS3Bucket(deksBucket, 'STANDARD')
    expect(deks.sort()).to.deep.equal(
      Array.from(
        new Set(
          writtenBlobs.map(
            ({ historyId }) => projectKey.format(historyId) + '/dek'
          )
        )
      ).sort()
    )
  })
  it('should have written the back filled files to history v1', async function () {
    for (const { historyId, fileId } of writtenBlobs) {
@@ -0,0 +1,54 @@
import express from 'express'

class MockFilestore {
  constructor() {
    this.host = process.env.FILESTORE_HOST || '127.0.0.1'
    this.port = process.env.FILESTORE_PORT || 3009
    // create a server listening on this.host and this.port
    this.files = {}

    this.app = express()

    this.app.get('/project/:projectId/file/:fileId', (req, res) => {
      const { projectId, fileId } = req.params
      const content = this.files[projectId]?.[fileId]
      if (!content) return res.status(404).end()
      res.status(200).end(content)
    })
  }

  start() {
    // reset stored files
    this.files = {}
    // start the server
    if (this.serverPromise) {
      return this.serverPromise
    } else {
      this.serverPromise = new Promise((resolve, reject) => {
        this.server = this.app.listen(this.port, this.host, err => {
          if (err) return reject(err)
          resolve()
        })
      })
      return this.serverPromise
    }
  }

  addFile(projectId, fileId, fileContent) {
    if (!this.files[projectId]) {
      this.files[projectId] = {}
    }
    this.files[projectId][fileId] = fileContent
  }

  deleteObject(projectId, fileId) {
    if (this.files[projectId]) {
      delete this.files[projectId][fileId]
      if (Object.keys(this.files[projectId]).length === 0) {
        delete this.files[projectId]
      }
    }
  }
}

export const mockFilestore = new MockFilestore()
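The test hunks above replace direct FILESTORE_PERSISTOR uploads with this helper, which the suite consumes through the shared mockFilestore singleton. A minimal usage sketch under assumed values (the import path, project id, file id, and file content are placeholders, not taken from the diff):

    // Sketch: start the fake filestore, register a file, fetch it back over HTTP.
    import { mockFilestore } from './MockFilestore.mjs' // import path is an assumption

    await mockFilestore.start() // idempotent: repeat calls reuse the same server
    mockFilestore.addFile('proj1', 'file1', 'hello world')

    const res = await fetch(
      'http://127.0.0.1:3009/project/proj1/file/file1' // default host/port from the constructor
    )
    console.log(res.status, await res.text()) // 200 'hello world'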
@@ -42,7 +42,7 @@ services:
    command: tar -czf /tmp/build/build.tar.gz --exclude=build.tar.gz --exclude-vcs .
    user: root
  mongo:
-    image: mongo:7.0.20
+    image: mongo:8.0.11
    command: --replSet overleaf
    volumes:
      - ../../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
@@ -44,7 +44,7 @@ services:
    command: npm run --silent test:acceptance

  mongo:
-    image: mongo:7.0.20
+    image: mongo:8.0.11
    command: --replSet overleaf
    volumes:
      - ../../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
@@ -55,7 +55,7 @@ services:
      retries: 20

  mongo:
-    image: mongo:7.0.20
+    image: mongo:8.0.11
    command: --replSet overleaf
    volumes:
      - ../../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
@@ -57,7 +57,7 @@ services:
      retries: 20

  mongo:
-    image: mongo:7.0.20
+    image: mongo:8.0.11
    command: --replSet overleaf
    volumes:
      - ../../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
6 services/references/.eslintrc Normal file

@@ -0,0 +1,6 @@
{
  "parserOptions": {
    "ecmaVersion": 2022,
    "sourceType": "module"
  }
}
5 services/references/.gitignore vendored Normal file

@@ -0,0 +1,5 @@
node_modules
forever

# managed by dev-environment$ bin/update_build_scripts
.npmrc
3 services/references/.mocharc.json Normal file

@@ -0,0 +1,3 @@
{
  "require": "test/setup.js"
}
1 services/references/.nvmrc Normal file

@@ -0,0 +1 @@
20.18.2
27 services/references/Dockerfile Normal file

@@ -0,0 +1,27 @@
# This file was auto-generated, do not edit it directly.
# Instead run bin/update_build_scripts from
# https://github.com/overleaf/internal/

FROM node:20.18.2 AS base

WORKDIR /overleaf/services/references

# Google Cloud Storage needs a writable $HOME/.config for resumable uploads
# (see https://googleapis.dev/nodejs/storage/latest/File.html#createWriteStream)
RUN mkdir /home/node/.config && chown node:node /home/node/.config

FROM base AS app

COPY package.json package-lock.json /overleaf/
COPY services/references/package.json /overleaf/services/references/
COPY libraries/ /overleaf/libraries/
COPY patches/ /overleaf/patches/

RUN cd /overleaf && npm ci --quiet

COPY services/references/ /overleaf/services/references/

FROM app
USER node

CMD ["node", "--expose-gc", "app.js"]
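The image is normally built through the Makefile further down, which passes the monorepo root (../..) as the build context; an equivalent manual build from the repository root would look roughly like `docker build --file services/references/Dockerfile --tag references:local .`, where the tag name is a placeholder.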
662 services/references/LICENSE Normal file

@@ -0,0 +1,662 @@
GNU AFFERO GENERAL PUBLIC LICENSE
Version 3, 19 November 2007

[The new file contains the complete, unmodified text of the GNU Affero General Public License, version 3, as published by the Free Software Foundation: http://www.gnu.org/licenses/]
156 services/references/Makefile Normal file

@@ -0,0 +1,156 @@
# This file was auto-generated, do not edit it directly.
# Instead run bin/update_build_scripts from
# https://github.com/overleaf/internal/

BUILD_NUMBER ?= local
BRANCH_NAME ?= $(shell git rev-parse --abbrev-ref HEAD)
PROJECT_NAME = references
BUILD_DIR_NAME = $(shell pwd | xargs basename | tr -cd '[a-zA-Z0-9_.\-]')

DOCKER_COMPOSE_FLAGS ?= -f docker-compose.yml
DOCKER_COMPOSE := BUILD_NUMBER=$(BUILD_NUMBER) \
	BRANCH_NAME=$(BRANCH_NAME) \
	PROJECT_NAME=$(PROJECT_NAME) \
	MOCHA_GREP=${MOCHA_GREP} \
	docker compose ${DOCKER_COMPOSE_FLAGS}

COMPOSE_PROJECT_NAME_TEST_ACCEPTANCE ?= test_acceptance_$(BUILD_DIR_NAME)
DOCKER_COMPOSE_TEST_ACCEPTANCE = \
	COMPOSE_PROJECT_NAME=$(COMPOSE_PROJECT_NAME_TEST_ACCEPTANCE) $(DOCKER_COMPOSE)

COMPOSE_PROJECT_NAME_TEST_UNIT ?= test_unit_$(BUILD_DIR_NAME)
DOCKER_COMPOSE_TEST_UNIT = \
	COMPOSE_PROJECT_NAME=$(COMPOSE_PROJECT_NAME_TEST_UNIT) $(DOCKER_COMPOSE)

clean:
	-docker rmi ci/$(PROJECT_NAME):$(BRANCH_NAME)-$(BUILD_NUMBER)
	-docker rmi us-east1-docker.pkg.dev/overleaf-ops/ol-docker/$(PROJECT_NAME):$(BRANCH_NAME)-$(BUILD_NUMBER)
	-$(DOCKER_COMPOSE_TEST_UNIT) down --rmi local
	-$(DOCKER_COMPOSE_TEST_ACCEPTANCE) down --rmi local

HERE=$(shell pwd)
MONOREPO=$(shell cd ../../ && pwd)
# Run the linting commands in the scope of the monorepo.
# Eslint and prettier (plus some configs) are on the root.
RUN_LINTING = docker run --rm -v $(MONOREPO):$(MONOREPO) -w $(HERE) node:20.18.2 npm run --silent

RUN_LINTING_CI = docker run --rm --volume $(MONOREPO)/.editorconfig:/overleaf/.editorconfig --volume $(MONOREPO)/.eslintignore:/overleaf/.eslintignore --volume $(MONOREPO)/.eslintrc:/overleaf/.eslintrc --volume $(MONOREPO)/.prettierignore:/overleaf/.prettierignore --volume $(MONOREPO)/.prettierrc:/overleaf/.prettierrc --volume $(MONOREPO)/tsconfig.backend.json:/overleaf/tsconfig.backend.json ci/$(PROJECT_NAME):$(BRANCH_NAME)-$(BUILD_NUMBER) npm run --silent

# Same but from the top of the monorepo
RUN_LINTING_MONOREPO = docker run --rm -v $(MONOREPO):$(MONOREPO) -w $(MONOREPO) node:20.18.2 npm run --silent

SHELLCHECK_OPTS = \
	--shell=bash \
	--external-sources
SHELLCHECK_COLOR := $(if $(CI),--color=never,--color)
SHELLCHECK_FILES := { git ls-files "*.sh" -z; git grep -Plz "\A\#\!.*bash"; } | sort -zu

shellcheck:
	@$(SHELLCHECK_FILES) | xargs -0 -r docker run --rm -v $(HERE):/mnt -w /mnt \
		koalaman/shellcheck:stable $(SHELLCHECK_OPTS) $(SHELLCHECK_COLOR)

shellcheck_fix:
	@$(SHELLCHECK_FILES) | while IFS= read -r -d '' file; do \
		diff=$$(docker run --rm -v $(HERE):/mnt -w /mnt koalaman/shellcheck:stable $(SHELLCHECK_OPTS) --format=diff "$$file" 2>/dev/null); \
		if [ -n "$$diff" ] && ! echo "$$diff" | patch -p1 >/dev/null 2>&1; then echo "\033[31m$$file\033[0m"; \
		elif [ -n "$$diff" ]; then echo "$$file"; \
		else echo "\033[2m$$file\033[0m"; fi \
	done

format:
	$(RUN_LINTING) format

format_ci:
	$(RUN_LINTING_CI) format

format_fix:
	$(RUN_LINTING) format:fix

lint:
	$(RUN_LINTING) lint

lint_ci:
	$(RUN_LINTING_CI) lint

lint_fix:
	$(RUN_LINTING) lint:fix

typecheck:
	$(RUN_LINTING) types:check

typecheck_ci:
	$(RUN_LINTING_CI) types:check

test: format lint typecheck shellcheck test_unit test_acceptance

test_unit:
ifneq (,$(wildcard test/unit))
	$(DOCKER_COMPOSE_TEST_UNIT) run --rm test_unit
	$(MAKE) test_unit_clean
endif

test_clean: test_unit_clean
test_unit_clean:
ifneq (,$(wildcard test/unit))
	$(DOCKER_COMPOSE_TEST_UNIT) down -v -t 0
endif

test_acceptance: test_acceptance_clean test_acceptance_pre_run test_acceptance_run
	$(MAKE) test_acceptance_clean

test_acceptance_debug: test_acceptance_clean test_acceptance_pre_run test_acceptance_run_debug
	$(MAKE) test_acceptance_clean

test_acceptance_run:
ifneq (,$(wildcard test/acceptance))
	$(DOCKER_COMPOSE_TEST_ACCEPTANCE) run --rm test_acceptance
endif

test_acceptance_run_debug:
ifneq (,$(wildcard test/acceptance))
	$(DOCKER_COMPOSE_TEST_ACCEPTANCE) run -p 127.0.0.9:19999:19999 --rm test_acceptance npm run test:acceptance -- --inspect=0.0.0.0:19999 --inspect-brk
endif

test_clean: test_acceptance_clean
test_acceptance_clean:
	$(DOCKER_COMPOSE_TEST_ACCEPTANCE) down -v -t 0

test_acceptance_pre_run:
ifneq (,$(wildcard test/acceptance/js/scripts/pre-run))
	$(DOCKER_COMPOSE_TEST_ACCEPTANCE) run --rm test_acceptance test/acceptance/js/scripts/pre-run
endif

benchmarks:
	$(DOCKER_COMPOSE_TEST_ACCEPTANCE) run --rm test_acceptance npm run benchmarks

build:
	docker build \
		--pull \
		--build-arg BUILDKIT_INLINE_CACHE=1 \
		--tag ci/$(PROJECT_NAME):$(BRANCH_NAME)-$(BUILD_NUMBER) \
		--tag us-east1-docker.pkg.dev/overleaf-ops/ol-docker/$(PROJECT_NAME):$(BRANCH_NAME)-$(BUILD_NUMBER) \
		--tag us-east1-docker.pkg.dev/overleaf-ops/ol-docker/$(PROJECT_NAME):$(BRANCH_NAME) \
		--cache-from us-east1-docker.pkg.dev/overleaf-ops/ol-docker/$(PROJECT_NAME):$(BRANCH_NAME) \
		--cache-from us-east1-docker.pkg.dev/overleaf-ops/ol-docker/$(PROJECT_NAME):main \
		--file Dockerfile \
		../..

tar:
	$(DOCKER_COMPOSE) up tar

publish:

	docker push $(DOCKER_REPO)/$(PROJECT_NAME):$(BRANCH_NAME)-$(BUILD_NUMBER)


.PHONY: clean \
	format format_fix \
	lint lint_fix \
	build_types typecheck \
	lint_ci format_ci typecheck_ci \
	shellcheck shellcheck_fix \
	test test_clean test_unit test_unit_clean \
	test_acceptance test_acceptance_debug test_acceptance_pre_run \
	test_acceptance_run test_acceptance_run_debug test_acceptance_clean \
	benchmarks \
	build tar publish \
10 services/references/README.md Normal file
@ -0,0 +1,10 @@
overleaf/references
===============

An API for providing citation keys from user .bib files

License
=======
The code in this repository is released under the GNU AFFERO GENERAL PUBLIC LICENSE, version 3.

Based on https://github.com/overleaf/overleaf/commit/9964aebc794f9fd7ce1373ab3484f6b33b061af3
40 services/references/app.js Normal file
@ -0,0 +1,40 @@
import '@overleaf/metrics/initialize.js'

import express from 'express'
import Settings from '@overleaf/settings'
import logger from '@overleaf/logger'
import metrics from '@overleaf/metrics'
import ReferencesAPIController from './app/js/ReferencesAPIController.js'
import bodyParser from 'body-parser'

const app = express()
metrics.injectMetricsRoute(app)

app.use(bodyParser.json({ limit: '2mb' }))
app.use(metrics.http.monitor(logger))

app.post('/project/:project_id/index', ReferencesAPIController.index)
app.get('/status', (req, res) => res.send({ status: 'references api is up' }))

const settings =
  Settings.internal && Settings.internal.references
    ? Settings.internal.references
    : undefined
const host = settings && settings.host ? settings.host : 'localhost'
const port = settings && settings.port ? settings.port : 3056

logger.debug('Listening at', { host, port })

const server = app.listen(port, host, function (error) {
  if (error) {
    throw error
  }
  logger.info({ host, port }, 'references HTTP server starting up')
})

process.on('SIGTERM', () => {
  server.close(() => {
    logger.info({ host, port }, 'references HTTP server closed')
    metrics.close()
  })
})
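The new service exposes just two routes. A minimal health-check sketch against the `/status` route, assuming the defaults from the code above (`localhost:3056`):

```js
// Minimal sketch: verify the references service is reachable.
// Assumes the default host/port resolved in app.js above.
const res = await fetch('http://localhost:3056/status')
console.log(await res.json()) // { status: 'references api is up' }
```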
42 services/references/app/js/ReferencesAPIController.js Normal file
@ -0,0 +1,42 @@
import logger from '@overleaf/logger'
import BibtexParser from './bib2json.js'

export default {
  async index(req, res) {
    const { docUrls, fullIndex } = req.body
    try {
      const responses = await Promise.all(
        docUrls.map(async (docUrl) => {
          try {
            const response = await fetch(docUrl)
            if (!response.ok) {
              throw new Error(`HTTP error! status: ${response.status}`)
            }
            return response.text()
          } catch (error) {
            logger.error({ error }, 'Failed to fetch document from URL: ' + docUrl)
            return null
          }
        })
      )
      const keys = []
      for (const body of responses) {
        if (!body) continue

        try {
          const parsedEntries = BibtexParser(body).entries
          const ks = parsedEntries
            .filter(entry => entry.EntryKey)
            .map(entry => entry.EntryKey)
          keys.push(...ks)
        } catch (error) {
          logger.error({ error }, 'bib file skipped.')
        }
      }
      res.status(200).json({ keys })
    } catch (error) {
      logger.error({ error }, 'Unexpected error during indexing process.')
      res.status(500).json({ error: 'Failed to process bib files.' })
    }
  }
}
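The request body carries a list of fetchable `.bib` URLs, and the reply flattens all parsed entries into one list of citation keys. A minimal client sketch; the project id and `docUrls` value are hypothetical placeholders, not part of this diff:

```js
// Hypothetical example: index one .bib file and collect its citation keys.
const res = await fetch('http://localhost:3056/project/abc123/index', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    docUrls: ['http://web/project/abc123/doc/xyz/raw'], // hypothetical URL
    fullIndex: true,
  }),
})
console.log(await res.json()) // e.g. { keys: ['knuth1984', 'lamport1994'] }
```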
1967 services/references/app/js/bib2json.js Normal file
File diff suppressed because it is too large.
9 services/references/buildscript.txt Normal file
@ -0,0 +1,9 @@
references
--dependencies=mongo
--docker-repos=us-east1-docker.pkg.dev/overleaf-ops/ol-docker
--env-add=
--env-pass-through=
--esmock-loader=True
--node-version=20.18.2
--public-repo=False
--script-version=4.5.0
9 services/references/config/settings.defaults.cjs Normal file
@ -0,0 +1,9 @@
module.exports = {
  internal: {
    references: {
      port: 3056,
      host: process.env.REFERENCES_HOST || '127.0.0.1',
    },
  },
}
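For the web service to reach this API it needs a matching entry on its own side; the `Settings` typedef later in this diff lists `apis.references.url`. A sketch of such a fragment — the `references` hostname and the URL shape are assumptions, not part of this diff:

```js
// Hypothetical web-side settings fragment pointing at the references service.
// REFERENCES_HOST mirrors the env var used in settings.defaults.cjs above.
module.exports = {
  apis: {
    references: {
      url: `http://${process.env.REFERENCES_HOST || 'references'}:3056`,
    },
  },
}
```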
52 services/references/docker-compose.ci.yml Normal file
@ -0,0 +1,52 @@
# This file was auto-generated, do not edit it directly.
# Instead run bin/update_build_scripts from
# https://github.com/overleaf/internal/

version: "2.3"

services:
  test_unit:
    image: ci/$PROJECT_NAME:$BRANCH_NAME-$BUILD_NUMBER
    user: node
    command: npm run test:unit:_run
    environment:
      NODE_ENV: test
      NODE_OPTIONS: "--unhandled-rejections=strict"

  test_acceptance:
    build: .
    image: ci/$PROJECT_NAME:$BRANCH_NAME-$BUILD_NUMBER
    environment:
      ELASTIC_SEARCH_DSN: es:9200
      MONGO_HOST: mongo
      POSTGRES_HOST: postgres
      MOCHA_GREP: ${MOCHA_GREP}
      NODE_ENV: test
      NODE_OPTIONS: "--unhandled-rejections=strict"
    depends_on:
      mongo:
        condition: service_started
    user: node
    command: npm run test:acceptance

  tar:
    build: .
    image: ci/$PROJECT_NAME:$BRANCH_NAME-$BUILD_NUMBER
    volumes:
      - ./:/tmp/build/
    command: tar -czf /tmp/build/build.tar.gz --exclude=build.tar.gz --exclude-vcs .
    user: root

  mongo:
    image: mongo:6.0.13
    command: --replSet overleaf
    volumes:
      - ../../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
    environment:
      MONGO_INITDB_DATABASE: sharelatex
    extra_hosts:
      # Required when using the automatic database setup for initializing the
      # replica set. This override is not needed when running the setup after
      # starting up mongo.
      - mongo:127.0.0.1
services/references/docker-compose.yml
Normal file
56
services/references/docker-compose.yml
Normal file
|
@ -0,0 +1,56 @@
|
|||
# This file was auto-generated, do not edit it directly.
|
||||
# Instead run bin/update_build_scripts from
|
||||
# https://github.com/overleaf/internal/
|
||||
|
||||
version: "2.3"
|
||||
|
||||
services:
|
||||
test_unit:
|
||||
image: node:20.18.2
|
||||
volumes:
|
||||
- .:/overleaf/services/references
|
||||
- ../../node_modules:/overleaf/node_modules
|
||||
- ../../libraries:/overleaf/libraries
|
||||
working_dir: /overleaf/services/references
|
||||
environment:
|
||||
MOCHA_GREP: ${MOCHA_GREP}
|
||||
LOG_LEVEL: ${LOG_LEVEL:-}
|
||||
NODE_ENV: test
|
||||
NODE_OPTIONS: "--unhandled-rejections=strict"
|
||||
command: npm run --silent test:unit
|
||||
user: node
|
||||
|
||||
test_acceptance:
|
||||
image: node:20.18.2
|
||||
volumes:
|
||||
- .:/overleaf/services/references
|
||||
- ../../node_modules:/overleaf/node_modules
|
||||
- ../../libraries:/overleaf/libraries
|
||||
working_dir: /overleaf/services/references
|
||||
environment:
|
||||
ELASTIC_SEARCH_DSN: es:9200
|
||||
MONGO_HOST: mongo
|
||||
POSTGRES_HOST: postgres
|
||||
MOCHA_GREP: ${MOCHA_GREP}
|
||||
LOG_LEVEL: ${LOG_LEVEL:-}
|
||||
NODE_ENV: test
|
||||
NODE_OPTIONS: "--unhandled-rejections=strict"
|
||||
user: node
|
||||
depends_on:
|
||||
mongo:
|
||||
condition: service_started
|
||||
command: npm run --silent test:acceptance
|
||||
|
||||
mongo:
|
||||
image: mongo:6.0.13
|
||||
command: --replSet overleaf
|
||||
volumes:
|
||||
- ../../bin/shared/mongodb-init-replica-set.js:/docker-entrypoint-initdb.d/mongodb-init-replica-set.js
|
||||
environment:
|
||||
MONGO_INITDB_DATABASE: sharelatex
|
||||
extra_hosts:
|
||||
# Required when using the automatic database setup for initializing the
|
||||
# replica set. This override is not needed when running the setup after
|
||||
# starting up mongo.
|
||||
- mongo:127.0.0.1
|
||||
|
26 services/references/package.json Normal file
@ -0,0 +1,26 @@
{
  "name": "@overleaf/references",
  "description": "An API for providing citation-keys",
  "private": true,
  "type": "module",
  "main": "app.js",
  "scripts": {
    "start": "node app.js"
  },
  "version": "0.1.0",
  "dependencies": {
    "@overleaf/settings": "*",
    "@overleaf/logger": "*",
    "@overleaf/metrics": "*",
    "async": "^3.2.5",
    "express": "^4.21.2"
  },
  "devDependencies": {
    "chai": "^4.3.6",
    "chai-as-promised": "^7.1.1",
    "esmock": "^2.6.9",
    "mocha": "^11.1.0",
    "sinon": "^9.2.4",
    "typescript": "^5.0.4"
  }
}
12 services/references/tsconfig.json Normal file
@ -0,0 +1,12 @@
{
  "extends": "../../tsconfig.backend.json",
  "include": [
    "app.js",
    "app/js/**/*",
    "benchmarks/**/*",
    "config/**/*",
    "scripts/**/*",
    "test/**/*",
    "types"
  ]
}
@ -56,14 +56,8 @@ if (Settings.catchErrors) {
// Create ./data/dumpFolder if needed
FileWriter.ensureDumpFolderExists()

if (
  !Features.hasFeature('project-history-blobs') &&
  !Features.hasFeature('filestore')
) {
  throw new Error(
    'invalid config: must enable either project-history-blobs (Settings.enableProjectHistoryBlobs=true) or enable filestore (Settings.disableFilestore=false)'
  )
}
// Validate combination of feature flags.
Features.validateSettings()

// handle SIGTERM for graceful shutdown in kubernetes
process.on('SIGTERM', function (signal) {
@ -36,7 +36,22 @@ function send401WithChallenge(res) {
function checkCredentials(userDetailsMap, user, password) {
  const expectedPassword = userDetailsMap.get(user)
  const userExists = userDetailsMap.has(user) && expectedPassword // user exists with a non-null password
  const isValid = userExists && tsscmp(expectedPassword, password)

  let isValid = false
  if (userExists) {
    if (Array.isArray(expectedPassword)) {
      const isValidPrimary = Boolean(
        expectedPassword[0] && tsscmp(expectedPassword[0], password)
      )
      const isValidFallback = Boolean(
        expectedPassword[1] && tsscmp(expectedPassword[1], password)
      )
      isValid = isValidPrimary || isValidFallback
    } else {
      isValid = tsscmp(expectedPassword, password)
    }
  }

  if (!isValid) {
    logger.err({ user }, 'invalid login details')
  }
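The array form lets an operator rotate a credential without downtime: the map value holds a primary and an optional fallback secret, and either one passes the timing-safe comparison. A minimal sketch of how such a map could look, with hypothetical credentials:

```js
// Hypothetical credential map for the array-based check above.
// While rotating, both the new and the old secret are accepted.
const userDetailsMap = new Map([
  ['ci-bot', ['new-secret', 'old-secret']], // rotation in progress
  ['admin', 'single-secret'], // plain string form still works
])
// 'old-secret' passes via the fallback entry, 'new-secret' via the primary.
checkCredentials(userDetailsMap, 'ci-bot', 'old-secret')
```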
@ -82,6 +97,7 @@ const AuthenticationController = {
  analyticsId: user.analyticsId || user._id,
  alphaProgram: user.alphaProgram || undefined, // only store if set
  betaProgram: user.betaProgram || undefined, // only store if set
  externalAuth: user.externalAuth || false,
}
if (user.isAdmin) {
  lightUser.isAdmin = true
@ -692,7 +692,7 @@ async function _getContentFromMongo(projectId) {

function _finaliseRequest(projectId, options, project, docs, files) {
  const resources = []
  let flags
  let flags = []
  let rootResourcePath = null
  let rootResourcePathOverride = null
  let hasMainFile = false

@ -771,6 +771,10 @@ function _finaliseRequest(projectId, options, project, docs, files) {
    flags = ['-file-line-error']
  }

  if (process.env.TEX_COMPILER_EXTRA_FLAGS) {
    flags.push(...process.env.TEX_COMPILER_EXTRA_FLAGS.split(/\s+/).filter(Boolean))
  }

  return {
    compile: {
      options: {
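The new environment variable is split on whitespace, so several compiler flags can be passed in one string. A small sketch of the parsing behaviour; the flag values are examples only:

```js
// Example only: how TEX_COMPILER_EXTRA_FLAGS becomes extra compile flags.
process.env.TEX_COMPILER_EXTRA_FLAGS = ' -shell-escape  -interaction=nonstopmode '
const extra = process.env.TEX_COMPILER_EXTRA_FLAGS.split(/\s+/).filter(Boolean)
console.log(extra) // ['-shell-escape', '-interaction=nonstopmode']
```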
@ -7,6 +7,7 @@ import logger from '@overleaf/logger'
import _ from 'lodash'
import { plainTextResponse } from '../../infrastructure/Response.js'
import { expressify } from '@overleaf/promise-utils'
import Modules from '../../infrastructure/Modules.js'

async function getDocument(req, res) {
  const { Project_id: projectId, doc_id: docId } = req.params

@ -92,6 +93,9 @@ async function setDocument(req, res) {
  { docId, projectId },
  'finished receiving set document request from api (docupdater)'
)

await Modules.promises.hooks.fire('docModified', projectId, docId)

res.json(result)
}
@ -32,14 +32,6 @@ module.exports = {
getCanonicalURL,
getSafeRedirectPath,
getSafeAdminDomainRedirect,
wrapUrlWithProxy(url) {
  // TODO: Consider what to do for Community and Enterprise edition?
  if (!Settings.apis.linkedUrlProxy.url) {
    throw new Error('no linked url proxy configured')
  }
  return `${Settings.apis.linkedUrlProxy.url}?url=${encodeURIComponent(url)}`
},

prependHttpIfNeeded(url) {
  if (!url.match('://')) {
    url = `http://${url}`
@ -8,7 +8,7 @@ function projectHistoryURLWithFilestoreFallback(
) {
  const filestoreURL = `${Settings.apis.filestore.url}/project/${projectId}/file/${fileRef._id}?from=${origin}`
  // TODO: When this file is converted to ES modules we will be able to use Features.hasFeature('project-history-blobs'). Currently we can't stub the feature return value in tests.
  if (fileRef.hash && Settings.enableProjectHistoryBlobs) {
  if (fileRef.hash && Settings.filestoreMigrationLevel >= 1) {
    return {
      url: `${Settings.apis.project_history.url}/project/${historyId}/blob/${fileRef.hash}`,
      fallbackURL: filestoreURL,
@ -72,7 +72,6 @@ function _getUrl(projectId, data, currentUserId) {
  if (!urlValidator.isWebUri(url)) {
    throw new InvalidUrlError(`invalid url: ${url}`)
  }
  url = UrlHelper.wrapUrlWithProxy(url)
  return url
}
@ -72,6 +72,7 @@ async function getUserForPasswordResetToken(token) {
  'overleaf.id': 1,
  email: 1,
  must_reconfirm: 1,
  hashedPassword: 1,
})

await assertUserPermissions(user, ['change-password'])
@ -590,7 +590,7 @@ const _ProjectController = {
}

const isAdminOrTemplateOwner =
  hasAdminAccess(user) || Settings.templates?.user_id === userId
  hasAdminAccess(user) || Settings.templates?.nonAdminCanManage
const showTemplatesServerPro =
  Features.hasFeature('templates-server-pro') && isAdminOrTemplateOwner
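Template management is therefore no longer tied to a single `templates.user_id`: any admin qualifies, and `nonAdminCanManage` opens management to regular users. A minimal sketch of the relevant settings fragment — the key names come from this diff, the surrounding structure is an assumption:

```js
// Hypothetical settings fragment enabling the template gallery
// and letting non-admin users manage templates.
module.exports = {
  templates: {
    nonAdminCanManage: true,
  },
}
```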
@ -4,7 +4,7 @@ const Path = require('path')
const Features = require('../../infrastructure/Features')

module.exports = ProjectEditorHandler = {
  trackChangesAvailable: false,
  trackChangesAvailable: true,

  buildProjectModelView(
    project,

@ -27,7 +27,7 @@ module.exports = ProjectEditorHandler = {
  deletedByExternalDataSource: project.deletedByExternalDataSource || false,
  imageName:
    project.imageName != null
      ? Path.basename(project.imageName)
      ? project.imageName
      : undefined,
}
@ -24,7 +24,6 @@ const ProjectOptionsHandler = {
  if (!imageName || !Array.isArray(settings.allowedImageNames)) {
    return
  }
  imageName = imageName.toLowerCase()
  const isAllowed = settings.allowedImageNames.find(
    allowed => imageName === allowed.imageName
  )

@ -32,7 +31,7 @@ const ProjectOptionsHandler = {
  throw new Error(`invalid imageName: ${imageName}`)
}
const conditions = { _id: projectId }
const update = { imageName: settings.imageRoot + '/' + imageName }
const update = { imageName: imageName }
return Project.updateOne(conditions, update, {})
},
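Together with the `Path.basename` removal above, this changes what is persisted: the project now stores the selected image name as-is rather than an `imageRoot`-prefixed path. A before/after sketch; the image name is illustrative only:

```js
// Illustrative only: the update document persisted by setImageName.
// Before: { imageName: settings.imageRoot + '/' + 'texlive-full:2024.1' }
// After:  { imageName: 'texlive-full:2024.1' }
```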
@ -7,21 +7,22 @@ const { expressify } = require('@overleaf/promise-utils')

const TemplatesController = {
  async getV1Template(req, res) {
    const templateVersionId = req.params.Template_version_id
    const templateId = req.query.id
    if (!/^[0-9]+$/.test(templateVersionId) || !/^[0-9]+$/.test(templateId)) {
      logger.err(
        { templateVersionId, templateId },
        'invalid template id or version'
      )
      return res.sendStatus(400)
    }
    const templateId = req.params.Template_version_id
    const templateVersionId = req.query.version
    // if (!/^[0-9]+$/.test(templateVersionId) || !/^[0-9]+$/.test(templateId)) {
    //   logger.err(
    //     { templateVersionId, templateId },
    //     'invalid template id or version'
    //   )
    //   return res.sendStatus(400)
    // }
    const data = {
      templateVersionId,
      templateId,
      name: req.query.templateName,
      compiler: ProjectHelper.compilerFromV1Engine(req.query.latexEngine),
      imageName: req.query.texImage,
      name: req.query.name,
      compiler: req.query.compiler,
      language: req.query.language,
      imageName: req.query.imageName,
      mainFile: req.query.mainFile,
      brandVariationId: req.query.brandVariationId,
    }

@ -36,6 +37,7 @@ const TemplatesController = {

  async createProjectFromV1Template(req, res) {
    const userId = SessionManager.getLoggedInUserId(req.session)

    const project = await TemplatesManager.promises.createProjectFromV1Template(
      req.body.brandVariationId,
      req.body.compiler,

@ -44,7 +46,8 @@ const TemplatesController = {
      req.body.templateName,
      req.body.templateVersionId,
      userId,
      req.body.imageName
      req.body.imageName,
      req.body.language
    )
    delete req.session.templateData
    if (!project) {
@ -18,6 +18,7 @@ const crypto = require('crypto')
const Errors = require('../Errors/Errors')
const { pipeline } = require('stream/promises')
const ClsiCacheManager = require('../Compile/ClsiCacheManager')
const TIMEOUT = 30000 // 30 sec

const TemplatesManager = {
  async createProjectFromV1Template(

@ -28,25 +29,19 @@ const TemplatesManager = {
    templateName,
    templateVersionId,
    userId,
    imageName
    imageName,
    language
  ) {
    const zipUrl = `${settings.apis.v1.url}/api/v1/overleaf/templates/${templateVersionId}`
    const zipUrl = `${settings.apis.filestore.url}/template/${templateId}/v/${templateVersionId}/zip`
    const zipReq = await fetchStreamWithResponse(zipUrl, {
      basicAuth: {
        user: settings.apis.v1.user,
        password: settings.apis.v1.pass,
      },
      signal: AbortSignal.timeout(settings.apis.v1.timeout),
      signal: AbortSignal.timeout(TIMEOUT),
    })

    const projectName = ProjectDetailsHandler.fixProjectName(templateName)
    const dumpPath = `${settings.path.dumpFolder}/${crypto.randomUUID()}`
    const writeStream = fs.createWriteStream(dumpPath)
    try {
      const attributes = {
        fromV1TemplateId: templateId,
        fromV1TemplateVersionId: templateVersionId,
      }
      const attributes = {}
      await pipeline(zipReq.stream, writeStream)

      if (zipReq.response.status !== 200) {

@ -78,14 +73,9 @@ const TemplatesManager = {
      await TemplatesManager._setCompiler(project._id, compiler)
      await TemplatesManager._setImage(project._id, imageName)
      await TemplatesManager._setMainFile(project._id, mainFile)
      await TemplatesManager._setSpellCheckLanguage(project._id, language)
      await TemplatesManager._setBrandVariationId(project._id, brandVariationId)

      const update = {
        fromV1TemplateId: templateId,
        fromV1TemplateVersionId: templateVersionId,
      }
      await Project.updateOne({ _id: project._id }, update, {})

      await prepareClsiCacheInBackground

      return project

@ -102,11 +92,12 @@ const TemplatesManager = {
  },

  async _setImage(projectId, imageName) {
    if (!imageName) {
      imageName = 'wl_texlive:2018.1'
    try {
      await ProjectOptionsHandler.setImageName(projectId, imageName)
    } catch {
      logger.warn({ imageName: imageName }, 'not available')
      await ProjectOptionsHandler.setImageName(projectId, settings.currentImageName)
    }

    await ProjectOptionsHandler.setImageName(projectId, imageName)
  },

  async _setMainFile(projectId, mainFile) {

@ -116,6 +107,13 @@ const TemplatesManager = {
    await ProjectRootDocManager.setRootDocFromName(projectId, mainFile)
  },

  async _setSpellCheckLanguage(projectId, language) {
    if (language == null) {
      return
    }
    await ProjectOptionsHandler.setSpellCheckLanguage(projectId, language)
  },

  async _setBrandVariationId(projectId, brandVariationId) {
    if (brandVariationId == null) {
      return
@ -66,7 +66,7 @@ function uploadProject(req, res, next) {
async function uploadFile(req, res, next) {
  const timer = new metrics.Timer('file-upload')
  const name = req.body.name
  const path = req.file?.path
  const { path } = req.file
  const projectId = req.params.Project_id
  const userId = SessionManager.getLoggedInUserId(req.session)
  let { folder_id: folderId } = req.query

@ -162,8 +162,14 @@ function multerMiddleware(req, res, next) {
      .status(422)
      .json({ success: false, error: req.i18n.translate('file_too_large') })
  }

  return next(err)
  if (err) return next(err)
  if (!req.file?.path) {
    logger.info({ req }, 'missing req.file.path on upload')
    return res
      .status(400)
      .json({ success: false, error: 'invalid_upload_request' })
  }
  next()
})
}
@ -518,4 +518,5 @@ module.exports = {
  expireDeletedUsersAfterDuration: expressify(expireDeletedUsersAfterDuration),
  ensureAffiliationMiddleware: expressify(ensureAffiliationMiddleware),
  ensureAffiliation,
  doLogout,
}
@ -52,10 +52,8 @@ async function settingsPage(req, res) {
  const reconfirmedViaSAML = _.get(req.session, ['saml', 'reconfirmed'])
  delete req.session.saml
  let shouldAllowEditingDetails = true
  if (Settings.ldap && Settings.ldap.updateUserDetailsOnLogin) {
    shouldAllowEditingDetails = false
  }
  if (Settings.saml && Settings.saml.updateUserDetailsOnLogin) {
  const externalAuth = req.user.externalAuth
  if (externalAuth && Settings[externalAuth].updateUserDetailsOnLogin) {
    shouldAllowEditingDetails = false
  }
  const oauthProviders = Settings.oauthProviders || {}
@ -107,9 +107,9 @@ module.exports = function (webRouter, privateApiRouter, publicApiRouter) {

  webRouter.use(function (req, res, next) {
    req.externalAuthenticationSystemUsed =
      Features.externalAuthenticationSystemUsed
      () => !!req?.user?.externalAuth
    res.locals.externalAuthenticationSystemUsed =
      Features.externalAuthenticationSystemUsed
      () => !!req?.user?.externalAuth
    req.hasFeature = res.locals.hasFeature = Features.hasFeature
    next()
  })

@ -434,7 +434,7 @@ module.exports = function (webRouter, privateApiRouter, publicApiRouter) {
  labsEnabled: Settings.labs && Settings.labs.enable,
  wikiEnabled: Settings.overleaf != null || Settings.proxyLearn,
  templatesEnabled:
    Settings.overleaf != null || Settings.templates?.user_id != null,
    Settings.overleaf != null || Boolean(Settings.templates),
  cioWriteKey: Settings.analytics?.cio?.writeKey,
  cioSiteId: Settings.analytics?.cio?.siteId,
}
@ -12,15 +12,12 @@ const trackChangesModuleAvailable =
/**
 * @typedef {Object} Settings
 * @property {Object | undefined} apis
 * @property {Object | undefined} apis.linkedUrlProxy
 * @property {string | undefined} apis.linkedUrlProxy.url
 * @property {Object | undefined} apis.references
 * @property {string | undefined} apis.references.url
 * @property {boolean | undefined} enableGithubSync
 * @property {boolean | undefined} enableGitBridge
 * @property {boolean | undefined} enableHomepage
 * @property {boolean | undefined} enableProjectHistoryBlobs
 * @property {boolean | undefined} disableFilestore
 * @property {number} filestoreMigrationLevel
 * @property {boolean | undefined} enableSaml
 * @property {boolean | undefined} ldap
 * @property {boolean | undefined} oauth

@ -30,6 +27,14 @@ const trackChangesModuleAvailable =
 */

const Features = {
  validateSettings() {
    if (![0, 1, 2].includes(Settings.filestoreMigrationLevel)) {
      throw new Error(
        `invalid OVERLEAF_FILESTORE_MIGRATION_LEVEL=${Settings.filestoreMigrationLevel}, expected 0, 1 or 2`
      )
    }
  },

  /**
   * @returns {boolean}
   */

@ -56,7 +61,7 @@ const Features = {
  case 'registration-page':
    return (
      !Features.externalAuthenticationSystemUsed() ||
      Boolean(Settings.overleaf)
      Boolean(Settings.overleaf) || Settings.oidc?.allowedOIDCEmailDomains
    )
  case 'registration':
    return Boolean(Settings.overleaf)

@ -69,7 +74,7 @@ const Features = {
  case 'oauth':
    return Boolean(Settings.oauth)
  case 'templates-server-pro':
    return Boolean(Settings.templates?.user_id)
    return Boolean(Settings.templates)
  case 'affiliations':
  case 'analytics':
    return Boolean(_.get(Settings, ['apis', 'v1', 'url']))

@ -85,13 +90,12 @@ const Features = {
    )
  case 'link-url':
    return Boolean(
      _.get(Settings, ['apis', 'linkedUrlProxy', 'url']) &&
        Settings.enabledLinkedFileTypes.includes('url')
      Settings.enabledLinkedFileTypes.includes('url')
    )
  case 'project-history-blobs':
    return Boolean(Settings.enableProjectHistoryBlobs)
    return Settings.filestoreMigrationLevel > 0
  case 'filestore':
    return Boolean(Settings.disableFilestore) === false
    return Settings.filestoreMigrationLevel < 2
  case 'support':
    return supportModuleAvailable
  case 'symbol-palette':
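The two boolean flags (`enableProjectHistoryBlobs`, `disableFilestore`) collapse into a single three-state `filestoreMigrationLevel`, validated at boot by `validateSettings()`. A small sketch of what each level implies for the two feature checks above:

```js
// Behaviour of Features.hasFeature() per migration level, per the cases above.
// level 0: filestore only          -> blobs: false, filestore: true
// level 1: blobs, filestore kept   -> blobs: true,  filestore: true
// level 2: history blobs only      -> blobs: true,  filestore: false
for (const level of [0, 1, 2]) {
  console.log(level, { blobs: level > 0, filestore: level < 2 })
}
```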
@ -150,8 +150,7 @@ async function linkedFileAgentsIncludes() {
async function attachHooks() {
  for (const module of await modules()) {
    const { promises, ...hooks } = module.hooks || {}
    for (const hook in promises || {}) {
      const method = promises[hook]
    for (const [hook, method] of Object.entries(promises || {})) {
      attachHook(hook, method)
    }
    for (const hook in hooks || {}) {
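This is the machinery the `docModified` hook fired from DocumentController relies on: a module exports promise-based hooks, and `attachHooks()` registers each of them. A minimal sketch of a module shape that would be picked up; the module and its body are hypothetical:

```js
// Hypothetical module exporting a promise-based hook. attachHooks() above
// iterates Object.entries(module.hooks.promises) and registers each entry.
export default {
  hooks: {
    promises: {
      docModified: async (projectId, docId) => {
        // e.g. invalidate a cached citation index for this project
      },
    },
  },
}
```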
@ -217,6 +217,8 @@ async function initialize(webRouter, privateApiRouter, publicApiRouter) {
    CaptchaMiddleware.canSkipCaptcha
  )

  await Modules.applyRouter(webRouter, privateApiRouter, publicApiRouter)

  webRouter.get('/login', UserPagesController.loginPage)
  AuthenticationController.addEndpointToLoginWhitelist('/login')

@ -262,6 +264,8 @@ async function initialize(webRouter, privateApiRouter, publicApiRouter) {
    '/read-only/one-time-login'
  )

  await Modules.applyRouter(webRouter, privateApiRouter, publicApiRouter)

  webRouter.post('/logout', UserController.logout)

  webRouter.get('/restricted', AuthorizationMiddleware.restricted)

@ -285,8 +289,6 @@ async function initialize(webRouter, privateApiRouter, publicApiRouter) {
  TokenAccessRouter.apply(webRouter)
  HistoryRouter.apply(webRouter, privateApiRouter)

  await Modules.applyRouter(webRouter, privateApiRouter, publicApiRouter)

  if (Settings.enableSubscriptions) {
    webRouter.get(
      '/user/bonus',

@ -1271,6 +1273,10 @@ async function initialize(webRouter, privateApiRouter, publicApiRouter) {
    TokenAccessController.grantTokenAccessReadOnly
  )

  webRouter.get(['/learn*', '/blog*', '/latex*', '/for/*', '/contact*'], (req, res) => {
    res.redirect(301, `https://www.overleaf.com${req.originalUrl}`)
  })

  webRouter.get('/unsupported-browser', renderUnsupportedBrowserPage)

  webRouter.get('*', ErrorController.notFound)
@ -1,13 +1,13 @@
section.cookie-banner.hidden-print.hidden(aria-label='Cookie banner')
  .cookie-banner-content We only use cookies for essential purposes and to improve your experience on our site. You can find out more in our <a href="/legal#Cookies">cookie policy</a>.
section.cookie-banner.hidden-print.hidden(aria-label=translate('cookie_banner'))
  .cookie-banner-content !{translate('cookie_banner_info', {}, [{ name: 'a', attrs: { href: '/legal#Cookies' }}])}
  .cookie-banner-actions
    button(
      type='button'
      class='btn btn-link btn-sm'
      data-ol-cookie-banner-set-consent='essential'
    ) Essential cookies only
    ) #{translate('essential_cookies_only')}
    button(
      type='button'
      class='btn btn-primary btn-sm'
      data-ol-cookie-banner-set-consent='all'
    ) Accept all cookies
    ) #{translate('accept_all_cookies')}
@ -4,7 +4,7 @@ block vars
  - var suppressNavbar = true
  - var suppressFooter = true
  - var suppressSkipToContent = true
  - var suppressCookieBanner = true
  - var suppressPugCookieBanner = true

block content
  .content.content-alt
@ -24,7 +24,7 @@ block body
  else
    include layout/fat-footer

  if typeof suppressCookieBanner == 'undefined'
  if typeof suppressPugCookieBanner == 'undefined'
    include _cookie_banner

  if bootstrapVersion === 5
@ -69,5 +69,5 @@ block body
  else
    include layout/fat-footer-react-bootstrap-5

  if typeof suppressCookieBanner === 'undefined'
  if typeof suppressPugCookieBanner === 'undefined'
    include _cookie_banner
@ -27,7 +27,7 @@ block body
  else
    include layout/fat-footer-website-redesign

  if typeof suppressCookieBanner == 'undefined'
  if typeof suppressPugCookieBanner == 'undefined'
    include _cookie_banner

block contactModal
@ -161,6 +161,18 @@ nav.navbar.navbar-default.navbar-main.navbar-expand-lg(
      event-segmentation={page: currentUrl, item: 'register', location: 'top-menu'}
    ) #{translate('sign_up')}

  // templates link
  if settings.templates
    +nav-item
      +nav-link(
        href="/templates"
        event-tracking="menu-click"
        event-tracking-action="clicked"
        event-tracking-trigger="click"
        event-tracking-mb="true"
        event-segmentation={ page: currentUrl, item: 'templates', location: 'top-menu' }
      ) #{translate('templates')}

  // login link
  +nav-item
    +nav-link(
@ -159,6 +159,18 @@ nav.navbar.navbar-default.navbar-main(

  // logged out
  if !getSessionUser()
    // templates link
    if settings.templates
      li
        a(
          href="/templates"
          event-tracking="menu-click"
          event-tracking-action="clicked"
          event-tracking-trigger="click"
          event-tracking-mb="true"
          event-segmentation={ page: currentUrl, item: 'templates', location: 'top-menu' }
        ) #{translate('templates')}

    // register link
    if hasFeature('registration-page')
      li.primary
@ -2,7 +2,7 @@ extends ../../layout-marketing

block vars
  - var suppressFooter = true
  - var suppressCookieBanner = true
  - var suppressPugCookieBanner = true
  - var suppressSkipToContent = true

block content

@ -29,8 +29,10 @@ block content
  input(type="hidden" name="templateVersionId" value=templateVersionId)
  input(type="hidden" name="templateName" value=name)
  input(type="hidden" name="compiler" value=compiler)
  input(type="hidden" name="imageName" value=imageName)
  if imageName
    input(type="hidden" name="imageName" value=imageName)
  input(type="hidden" name="mainFile" value=mainFile)
  input(type="hidden" name="language" value=language)
  if brandVariationId
    input(type="hidden" name="brandVariationId" value=brandVariationId)
  input(hidden type="submit")
@ -7,7 +7,7 @@ block vars
  - var suppressNavbar = true
  - var suppressFooter = true
  - var suppressSkipToContent = true
  - var suppressCookieBanner = true
  - var suppressPugCookieBanner = true
  - metadata.robotsNoindexNofollow = true

block content
@ -7,6 +7,7 @@ block vars
  - const suppressNavContentLinks = true
  - const suppressNavbar = true
  - const suppressFooter = true
  - const suppressPugCookieBanner = true

block append meta
  meta(
@ -5,7 +5,7 @@ block entrypointVar

block vars
  - var suppressFooter = true
  - var suppressCookieBanner = true
  - var suppressPugCookieBanner = true
  - var suppressSkipToContent = true

block append meta
@ -5,7 +5,7 @@ block entrypointVar

block vars
  - var suppressFooter = true
  - var suppressCookieBanner = true
  - var suppressPugCookieBanner = true
  - var suppressSkipToContent = true

block append meta
18 services/web/app/views/template_gallery/template-gallery.pug Normal file
@ -0,0 +1,18 @@
extends ../layout-react

block entrypointVar
  - entrypoint = 'pages/template-gallery'

block vars
block vars
  - const suppressNavContentLinks = true
  - const suppressNavbar = true
  - const suppressFooter = true
  - bootstrap5PageStatus = 'enabled' // One of 'disabled', 'enabled', and 'queryStringOnly'
  - isWebsiteRedesign = false

block append meta
  meta(name="ol-templateCategory" data-type="string" content=category)

block content
  #template-gallery-root
20 services/web/app/views/template_gallery/template.pug Normal file
@ -0,0 +1,20 @@
extends ../layout-react

block entrypointVar
  - entrypoint = 'pages/template'

block vars
  - const suppressNavContentLinks = true
  - const suppressNavbar = true
  - const suppressFooter = true
  - bootstrap5PageStatus = 'enabled' // One of 'disabled', 'enabled', and 'queryStringOnly'
  - isWebsiteRedesign = false

block append meta
  meta(name="ol-template" data-type="json" content=template)
  meta(name="ol-languages" data-type="json" content=languages)
  meta(name="ol-userIsAdmin" data-type="boolean" content=hasAdminAccess())

block content
  #template-root
Some files were not shown because too many files have changed in this diff.