Compare commits

...

109 Commits

Author SHA1 Message Date
Giò Diani 43fe5b7931 typo 2025-01-19 14:23:02 +01:00
Giò Diani bbb3c71d79 remove deleted library from API 2025-01-19 14:17:57 +01:00
Giò Diani b616c27173 Merge remote-tracking branch 'origin/main' 2025-01-19 14:12:59 +01:00
Giò Diani 75bd923038 Unify API 2025-01-19 14:12:36 +01:00
mmaurostoffel e758e064d0 Deleted unused ETL process 2025-01-19 13:14:35 +01:00
Giò Diani e67636dbd6 Labels 2025-01-19 12:44:52 +01:00
Giò Diani acf1989576 Hide prediction chart symbols, reworked popovers. 2025-01-19 12:42:04 +01:00
mmaurostoffel cdb92ac50f Clean up etl_property_neighbours
Removed comments and adjusted the haversine formula so that it matches the source
2025-01-19 11:54:22 +01:00
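The haversine formula mentioned in the commit above computes the great-circle distance between two coordinates. A minimal Python sketch for illustration only; the function name and Earth-radius constant are assumptions, not code from etl_property_neighbours:

```python
import math

# Hypothetical sketch of the haversine great-circle distance; the function
# name and Earth-radius constant are illustrative, not repository code.
def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    r = 6371.0  # mean Earth radius in kilometres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```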
Giò Diani 41f8c89178 really basic mobile view 2025-01-19 11:27:21 +01:00
Giò Diani 5664e5130e reactivate cache of webapp 2025-01-18 23:59:19 +01:00
Giò Diani 107f8c46b9 Navigation labels 2025-01-18 23:46:07 +01:00
Giò Diani bc27afe05a prettier navi 2025-01-18 23:10:53 +01:00
mmaurostoffel 7663791f33 Deleted unused file etl_pipelines.py 2025-01-18 22:43:49 +01:00
Giò Diani 3d0d45e6bc Updated charts 2025-01-18 22:09:12 +01:00
Giò Diani c7988e77b8 fix missing name 2025-01-18 18:41:21 +01:00
Giò Diani 5ad31709a9 refactor dashboard 2025-01-18 17:31:31 +01:00
Giò Diani cd2d211259 documentation for api 2025-01-18 15:39:29 +01:00
Giò Diani e4e05b4788 refactoring; more consistent naming for API endpoints and variables. 2025-01-17 23:46:22 +01:00
Giò Diani e0c8b3eb1b Labels 2025-01-17 19:09:16 +01:00
mmaurostoffel 2560d43c3f fixed calculation of monthly data, closes #18 2025-01-17 17:51:27 +01:00
Giò Diani 4e233da745 fix error in tooltip of moving average chart 2025-01-17 17:37:01 +01:00
Giò Diani 7497271ac8 remove unused module causing error 2025-01-17 17:30:30 +01:00
mmaurostoffel 5ffc222430 Reworked movAverage, closes #17
The first month is now also displayed
Removed the "zero point" at the beginning
2025-01-17 17:01:29 +01:00
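The behaviour described in the commit above, a value for the first month instead of a leading zero, can be sketched with a trailing window that shrinks at the start. A hedged Python sketch, not the actual etl_region_movAverage implementation:

```python
# Sketch of a trailing moving average whose window shrinks at the start,
# so the first entries get a real value instead of a leading zero.
# Illustrative only; not the actual etl_region_movAverage code.
def moving_average(values: list[float], window: int = 3) -> list[float]:
    out = []
    for i in range(len(values)):
        start = max(0, i - window + 1)  # shorter window for the first entries
        chunk = values[start : i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```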
mmaurostoffel 468ad94430 Deleted Mauro folder, closes #16 2025-01-17 15:27:25 +01:00
mmaurostoffel 5b97c7ead2 Merge branch 'main' of https://gitea.fhgr.ch/stoffelmauro/ConsultancyProject_2_ETL 2025-01-17 15:25:28 +01:00
mmaurostoffel 9fb7fb2f82 Added the condition that calendar must not be null to all remaining SQL queries 2025-01-17 15:25:22 +01:00
Giò Diani d598f9d861 Connect gaps in line chart 2025-01-17 15:25:03 +01:00
Giò Diani e4462b0cfa Error message when no data is available for a property. 2025-01-17 15:13:41 +01:00
Giò Diani 0d3bc7acd3 don't include null values 2025-01-17 14:54:25 +01:00
Giò Diani 23bec87af6 Axis labels 2025-01-17 13:49:43 +01:00
Giò Diani 142046b1c6 Axis labels and grid for heatmap 2025-01-17 13:22:05 +01:00
Giò Diani 639533dda0 Border around heatmap 2025-01-17 13:02:54 +01:00
Giò Diani 4b10406eb7 Clean up etl module; run pixi install again so that the paths are updated! 2025-01-17 12:53:44 +01:00
Giò Diani 968f706218 Update README 2025-01-17 12:52:31 +01:00
Giò Diani c66f7c476d Label MovAvg chart, show region for property in the menu. 2025-01-17 12:51:45 +01:00
mmaurostoffel c3ab7d8e2f global for movingAverage implemented 2025-01-15 21:27:53 +01:00
mmaurostoffel 7e3862a578 Added global extractions for region capacities monthly and weekdays, closes #15 2025-01-15 20:44:16 +01:00
Giò Diani 0733709366 ETL diagram 2025-01-15 17:48:12 +01:00
Giò Diani 3290c1cce3 some polishing 2025-01-15 16:57:18 +01:00
Giò Diani 959b84d1e1 my fault fixes #14 2025-01-15 14:31:07 +01:00
Giò Diani a5a21fb925 Dashboard rework 2025-01-14 22:11:31 +01:00
Giò Diani 8bef4b9621 added simple caching for etl 2025-01-14 19:56:15 +01:00
mmaurostoffel d436c5d892 added missing logic to etl_region_movAverage 2025-01-13 23:06:42 +01:00
Giò Diani cd66207bc7 Prediction Charts 2025-01-13 22:50:03 +01:00
Giò Diani 18a672a5de bugfix unnecessary parentheses broke api 2025-01-13 19:22:54 +01:00
mmaurostoffel 3d7d5bbbe3 Added etl_region_capacities_monthly, closes #10 2025-01-13 18:02:19 +01:00
mmaurostoffel d8d2d1e757 Adjusted values from 0-1 to 0-100, closes #12 2025-01-13 17:17:46 +01:00
mmaurostoffel 4487932f1b Merge branch 'main' of https://gitea.fhgr.ch/stoffelmauro/ConsultancyProject_2_ETL 2025-01-13 17:12:15 +01:00
mmaurostoffel fcd7ca34ad closes #11 2025-01-13 17:08:37 +01:00
Giò Diani ebcd647a2f Several dashboard adjustments. Region view. 2025-01-13 17:06:54 +01:00
Giò Diani a3121bf58e navigation, regions 2025-01-12 20:55:46 +01:00
Giò Diani 50ea3f1bd8 fix module not found error (matplotlib not needed) 2025-01-12 20:44:05 +01:00
mmaurostoffel af1c2301a9 Added endpoint for movAvg 2025-01-12 20:38:16 +01:00
mmaurostoffel a571c8c40f Merge branch 'main' of https://gitea.fhgr.ch/stoffelmauro/ConsultancyProject_2_ETL 2025-01-12 20:30:03 +01:00
mmaurostoffel b23879b6d3 Added first version of etl_region_movAverage.py 2025-01-12 20:27:33 +01:00
Giò Diani 0250221d96 implements regions base endpoint #9 2025-01-12 20:16:20 +01:00
mmaurostoffel f31c23ea51 Merge branch 'main' of https://gitea.fhgr.ch/stoffelmauro/ConsultancyProject_2_ETL 2025-01-12 16:53:33 +01:00
mmaurostoffel c059890ba7 Added singleScrape_of_region 2025-01-12 16:53:31 +01:00
Giò Diani 992e299829 Property rework 2025-01-12 16:49:29 +01:00
Giò Diani 67c0d85213 Concerns #7. Possible fix, please verify the result. In my view the problem was that switching back and forth between lists and DataFrame lost the value types, which is why the corresponding error was thrown. 2025-01-12 11:56:33 +01:00
mmaurostoffel e176d1e73f Bugfix: deleted test function from etl_region_comparison
sry
2025-01-12 11:28:53 +01:00
mmaurostoffel f114eb7f5a Added strict=False as suggested in the error message 2025-01-12 11:20:39 +01:00
Giò Diani 99a112df24 longer timeouts 2025-01-11 21:54:26 +01:00
Giò Diani 2013d2b440 Restored the heatmap. @stoffelmauro I had to adjust something in the API for this. 2025-01-11 20:52:02 +01:00
mmaurostoffel 67382003ca closes #7: created etl_region_capacities
!! As described in the issue, etl_region_capacities was renamed to etl_region_properties_capacities, and the endpoints likewise !!

!! The query for the global data is implemented and works, but takes quite a long time !!
2025-01-11 17:33:50 +01:00
Giò Diani 774e30c945 fix daily chart 2025-01-09 18:49:09 +01:00
Giò Diani 3b935a4d20 Neighbours in popup 2025-01-09 18:34:20 +01:00
mmaurostoffel 638d835d3b closes #5
Test to try out the closing feature
2025-01-09 18:26:01 +01:00
mmaurostoffel cb6935b60c updated the output of the etl_property_neighbour.py
closes issue #5
2025-01-09 18:22:55 +01:00
mmaurostoffel 60a3d7d9b3 Little fixes for weekdays
#6
2025-01-09 17:40:58 +01:00
mmaurostoffel 65b63d1326 Added sorting for etl_property_capacities_monthly 2025-01-09 15:59:16 +01:00
mmaurostoffel a6cbe3bc29 Updated etl_capacities_weekdays.py 2025-01-09 15:07:38 +01:00
mmaurostoffel 2508b34ceb Bugfix: wrong name 2025-01-09 14:49:03 +01:00
mmaurostoffel cc71cbba2d Added endpoint for etl_property_neighbours 2025-01-09 14:48:33 +01:00
Giò Diani 258f1e4df6 Display monthly occupancy for properties in the dashboard. 2025-01-08 22:02:33 +01:00
mmaurostoffel 7884febe53 issue 5 resolved
#3
Output format:
{ids: [84, 43...44], lat: [...], lon: [...]}
2025-01-07 20:20:48 +01:00
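The columnar output format shown above (parallel lists keyed by field) could be produced from row tuples like this; the function name and row shape are illustrative assumptions, not repository code:

```python
# Illustrative helper for the {ids: [...], lat: [...], lon: [...]} payload;
# the function name and (id, lat, lon) row shape are assumptions.
def to_neighbours_payload(rows: list[tuple[int, float, float]]) -> dict:
    return {
        "ids": [row[0] for row in rows],
        "lat": [row[1] for row in rows],
        "lon": [row[2] for row in rows],
    }
```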
mmaurostoffel 42dc14021f Added etl_Property_capacities_weekdays.py
Added query capability for the weekdays
2025-01-06 19:42:49 +01:00
Giò Diani f5a2b16721 Added missing quotation marks, resolves #3 2025-01-05 21:45:23 +01:00
mmaurostoffel d9cae3d0ab Issue 3 almost done
#3

The issue is largely ready, but there is still the problem that the ScrapeDate is interpreted as an integer instead of a date in database.py. For now it is therefore implemented as a constant.
2025-01-05 21:23:10 +01:00
mmaurostoffel 8bcc1c57b5 Gitea issue 1, example 2
#1
Added etl_region_capacities_comparison
2025-01-05 17:25:29 +01:00
mmaurostoffel 03e78a4105 Issue 1, example 1 resolved
#1

Added global region capacities. Caution: long loading time!
2025-01-05 16:12:16 +01:00
mmaurostoffel 2a9ef9d991 Merge branch 'main' of https://gitea.fhgr.ch/stoffelmauro/ConsultancyProject_2_ETL 2025-01-05 15:51:27 +01:00
mmaurostoffel 8fcaf2a6f7 Gitea issue 2 resolved
#2

etl_region_capacities.py: new output format = [datum, prop_id, capacity]
2025-01-05 15:51:19 +01:00
Giò Diani 8655255782 Info button, dashboard start page 2025-01-05 13:26:51 +01:00
mmaurostoffel 281d9d3f5a Merge branch 'main' of https://gitea.fhgr.ch/stoffelmauro/ConsultancyProject_2_ETL 2025-01-05 13:19:45 +01:00
mmaurostoffel c68e6f54bd cleanup commit 2025-01-05 13:19:43 +01:00
Giò Diani 32d162c7c5 Detail view rework. 2025-01-04 18:16:12 +01:00
Giò Diani 466d3168c4 Added caching (adjust setting in .env accordingly; CACHE_STORE=file) 2025-01-03 16:44:17 +01:00
Giò Diani 5a2cc96a95 Axis labels, colors 2025-01-03 16:25:30 +01:00
Giò Diani 640a5b2f9e Implement region capacity as test 2024-12-20 21:46:54 +01:00
Giò Diani f585a7a2aa Fix missing import 2024-12-20 21:36:47 +01:00
mmaurostoffel 818d6fb5ec Merge branch 'main' of https://gitea.fhgr.ch/stoffelmauro/ConsultancyProject_2_ETL 2024-12-20 20:57:20 +01:00
mmaurostoffel a8b856b714 Created etl_region_capacities plus database and api/main adjustments for it 2024-12-20 20:57:10 +01:00
Giò Diani 0aa0f2345c documentation folder, c4 diagram 2024-12-20 17:38:08 +01:00
Giò Diani eb362d78ad Heatmap preparation 2024-12-20 15:25:33 +01:00
Giò Diani 5f61911a69 More sensible default in env, more documentation how to install and run. 2024-12-20 15:03:22 +01:00
Giò Diani 66d048c70e Enhance coordinated view of property. 2024-12-20 12:20:49 +01:00
Giò Diani 63590d69ab fix wrong import path. 2024-12-20 10:56:10 +01:00
Giò Diani 47a5035787 Dashboard single view: occupancy trend. 2024-12-19 19:22:09 +01:00
mmaurostoffel 4b7067fb63 Merge branch 'main' of https://gitea.fhgr.ch/stoffelmauro/ConsultancyProject_2_ETL 2024-12-19 18:11:17 +01:00
mmaurostoffel eba2f0a265 Data Quality updated to include Regions and more information 2024-12-19 18:11:15 +01:00
Giò Diani ce46655003 Add some more description. 2024-12-18 20:10:11 +01:00
Giò Diani 233f3c475a Added README 2024-12-18 19:55:42 +01:00
Giò Diani a8543d619f Further dashboard development. 2024-12-18 19:52:06 +01:00
Giò Diani 1574edea88 First steps Dashboard. 2024-12-18 15:14:13 +01:00
mmaurostoffel a03ce3d647 Took over changes from before the monorepo 2024-12-18 15:11:23 +01:00
Giò Diani f4a927e125 refactor to monorepo, install laravel. 2024-12-18 10:14:56 +01:00
mmaurostoffel 125250a665 Created data_quality.py to visualize data quality 2024-12-11 01:01:52 +01:00
mmaurostoffel 338d3e9cc2 Completed analysis of pre-booking lead time 2024-11-28 16:10:53 +01:00
1445 changed files with 16999 additions and 164016 deletions

6
.gitignore vendored

@@ -23,6 +23,7 @@
*.ipr
.idea/
# eclipse project file
.settings/
.classpath
@@ -65,3 +66,8 @@ env3.*/
# duckdb
*.duckdb
# cache
*.obj
/src/mauro/dok/

6
README.md Normal file

@@ -0,0 +1,6 @@
# Consultancy 2
## Projektstruktur
- etl: Contains the program code that prepares the data and makes it available via a REST API.
- dashboard: Web application for exploring and visualizing the data.

18
dashboard/.editorconfig Normal file

@@ -0,0 +1,18 @@
root = true
[*]
charset = utf-8
end_of_line = lf
indent_size = 4
indent_style = space
insert_final_newline = true
trim_trailing_whitespace = true
[*.md]
trim_trailing_whitespace = false
[*.{yml,yaml}]
indent_size = 2
[docker-compose.yml]
indent_size = 4

68
dashboard/.env.example Normal file

@@ -0,0 +1,68 @@
APP_NAME=Laravel
APP_ENV=local
APP_KEY=
APP_DEBUG=true
APP_TIMEZONE=UTC
APP_URL=http://localhost
APP_LOCALE=en
APP_FALLBACK_LOCALE=en
APP_FAKER_LOCALE=en_US
APP_MAINTENANCE_DRIVER=file
# APP_MAINTENANCE_STORE=database
PHP_CLI_SERVER_WORKERS=4
BCRYPT_ROUNDS=12
LOG_CHANNEL=stack
LOG_STACK=single
LOG_DEPRECATIONS_CHANNEL=null
LOG_LEVEL=debug
# DB_CONNECTION=sqlite
# DB_HOST=127.0.0.1
# DB_PORT=3306
# DB_DATABASE=laravel
# DB_USERNAME=root
# DB_PASSWORD=
SESSION_DRIVER=file
SESSION_LIFETIME=120
SESSION_ENCRYPT=false
SESSION_PATH=/
SESSION_DOMAIN=null
BROADCAST_CONNECTION=log
FILESYSTEM_DISK=local
QUEUE_CONNECTION=database
CACHE_STORE=file
CACHE_PREFIX=
MEMCACHED_HOST=127.0.0.1
REDIS_CLIENT=phpredis
REDIS_HOST=127.0.0.1
REDIS_PASSWORD=null
REDIS_PORT=6379
MAIL_MAILER=log
MAIL_SCHEME=null
MAIL_HOST=127.0.0.1
MAIL_PORT=2525
MAIL_USERNAME=null
MAIL_PASSWORD=null
MAIL_FROM_ADDRESS="hello@example.com"
MAIL_FROM_NAME="${APP_NAME}"
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_DEFAULT_REGION=us-east-1
AWS_BUCKET=
AWS_USE_PATH_STYLE_ENDPOINT=false
VITE_APP_NAME="${APP_NAME}"
FASTAPI_URI=http://localhost:8080

11
dashboard/.gitattributes vendored Normal file

@@ -0,0 +1,11 @@
* text=auto eol=lf
*.blade.php diff=html
*.css diff=css
*.html diff=html
*.md diff=markdown
*.php diff=php
/.github export-ignore
CHANGELOG.md export-ignore
.styleci.yml export-ignore

23
dashboard/.gitignore vendored Normal file

@@ -0,0 +1,23 @@
/.phpunit.cache
/node_modules
/public/build
/public/hot
/public/storage
/storage/*.key
/storage/pail
/vendor
.env
.env.backup
.env.production
.phpactor.json
.phpunit.result.cache
Homestead.json
Homestead.yaml
auth.json
npm-debug.log
yarn-error.log
/.fleet
/.idea
/.nova
/.vscode
/.zed

16
dashboard/README.md Normal file

@@ -0,0 +1,16 @@
# Install
## Prerequisites
- To run this project, install all required software according to the Laravel documentation: https://laravel.com/docs/11.x#installing-php
## Configuration & installation
- Copy .env.example to .env
- Run the following commands:
```bash
composer install && php artisan key:generate && npm i
```
# Run server
```bash
composer run dev
```

115
dashboard/app/Api.php Normal file

@@ -0,0 +1,115 @@
<?php
namespace App;
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Http;
/*
* Class contains methods which make calls to the API.
* Successful calls are cached.
*/
class Api
{
public static function get(string $path, string $query = ''): ?array
{
$endpoint = env('FASTAPI_URI');
$request = $endpoint.$path;
// load from cache if available
if (Cache::has($request)) {
return Cache::get($request);
}
// Set the HTTP timeout to half an hour (1800 seconds)
$get = Http::timeout(1800)->get($request);
// return result and cache it
if($get->successful()){
$result = $get->json();
Cache::put($request, $result);
return $result;
}
return null;
}
public static function propertiesGrowth(): mixed
{
return self::get('/properties/growth');
}
public static function propertiesGeo(): mixed
{
return self::get('/properties/geo');
}
public static function propertyExtractions(int $id): mixed
{
return self::get("/properties/{$id}/extractions");
}
public static function propertyCapacities(int $id): mixed
{
return self::get("/properties/{$id}/capacities");
}
public static function propertyBase(int $id): mixed
{
return self::get("/properties/{$id}/base");
}
public static function propertyCapacitiesMonthly(int $id, string $date): mixed
{
return self::get("/properties/{$id}/capacities/monthly/{$date}");
}
public static function propertyCapacitiesDaily(int $id, string $date): mixed
{
return self::get("/properties/{$id}/capacities/daily/{$date}");
}
public static function propertyNeighbours(int $id): mixed
{
return self::get("/properties/{$id}/neighbours");
}
public static function regions(): mixed
{
return self::get('/regions');
}
public static function regionBase(int $id): mixed
{
return self::get("/regions/{$id}/base");
}
public static function regionPropertiesCapacities(int $id): mixed
{
return self::get("/regions/{$id}/properties/capacities");
}
public static function regionCapacitiesMonthly(int $id, string $date): mixed
{
return self::get("/regions/{$id}/capacities/monthly/{$date}");
}
public static function regionCapacitiesDaily(int $id, string $date): mixed
{
return self::get("/regions/{$id}/capacities/daily/{$date}");
}
public static function regionCapacities(int $id): mixed
{
return self::get("/regions/{$id}/capacities");
}
public static function regionMovingAverage(int $id, string $date): mixed
{
return self::get("/regions/{$id}/moving-average/{$date}");
}
}

12
dashboard/app/Chart.php Normal file

@@ -0,0 +1,12 @@
<?php
namespace App;
class Chart
{
public static function colors(int $count = 5){
$colors = ['#9ebcda','#8c96c6','#88419d','#810f7c','#4d004b'];
return json_encode($colors);
}
}

8
dashboard/app/Http/Controllers/Controller.php Normal file

@@ -0,0 +1,8 @@
<?php
namespace App\Http\Controllers;
abstract class Controller
{
//
}

48
dashboard/app/Models/User.php Normal file

@@ -0,0 +1,48 @@
<?php
namespace App\Models;
// use Illuminate\Contracts\Auth\MustVerifyEmail;
use Illuminate\Database\Eloquent\Factories\HasFactory;
use Illuminate\Foundation\Auth\User as Authenticatable;
use Illuminate\Notifications\Notifiable;
class User extends Authenticatable
{
/** @use HasFactory<\Database\Factories\UserFactory> */
use HasFactory, Notifiable;
/**
* The attributes that are mass assignable.
*
* @var list<string>
*/
protected $fillable = [
'name',
'email',
'password',
];
/**
* The attributes that should be hidden for serialization.
*
* @var list<string>
*/
protected $hidden = [
'password',
'remember_token',
];
/**
* Get the attributes that should be cast.
*
* @return array<string, string>
*/
protected function casts(): array
{
return [
'email_verified_at' => 'datetime',
'password' => 'hashed',
];
}
}

24
dashboard/app/Providers/AppServiceProvider.php Normal file

@@ -0,0 +1,24 @@
<?php
namespace App\Providers;
use Illuminate\Support\ServiceProvider;
class AppServiceProvider extends ServiceProvider
{
/**
* Register any application services.
*/
public function register(): void
{
//
}
/**
* Bootstrap any application services.
*/
public function boot(): void
{
//
}
}

15
dashboard/artisan Executable file

@@ -0,0 +1,15 @@
#!/usr/bin/env php
<?php
use Symfony\Component\Console\Input\ArgvInput;
define('LARAVEL_START', microtime(true));
// Register the Composer autoloader...
require __DIR__.'/vendor/autoload.php';
// Bootstrap Laravel and handle the command...
$status = (require_once __DIR__.'/bootstrap/app.php')
->handleCommand(new ArgvInput);
exit($status);

18
dashboard/bootstrap/app.php Normal file

@@ -0,0 +1,18 @@
<?php
use Illuminate\Foundation\Application;
use Illuminate\Foundation\Configuration\Exceptions;
use Illuminate\Foundation\Configuration\Middleware;
return Application::configure(basePath: dirname(__DIR__))
->withRouting(
web: __DIR__.'/../routes/web.php',
commands: __DIR__.'/../routes/console.php',
health: '/up',
)
->withMiddleware(function (Middleware $middleware) {
//
})
->withExceptions(function (Exceptions $exceptions) {
//
})->create();

2
dashboard/bootstrap/cache/.gitignore vendored Normal file

@@ -0,0 +1,2 @@
*
!.gitignore

5
dashboard/bootstrap/providers.php Normal file

@@ -0,0 +1,5 @@
<?php
return [
App\Providers\AppServiceProvider::class,
];

74
dashboard/composer.json Normal file

@@ -0,0 +1,74 @@
{
"$schema": "https://getcomposer.org/schema.json",
"name": "laravel/laravel",
"type": "project",
"description": "The skeleton application for the Laravel framework.",
"keywords": [
"laravel",
"framework"
],
"license": "MIT",
"require": {
"php": "^8.2",
"laravel/framework": "^11.31",
"laravel/tinker": "^2.9"
},
"require-dev": {
"fakerphp/faker": "^1.23",
"laravel/pail": "^1.1",
"laravel/pint": "^1.13",
"laravel/sail": "^1.26",
"mockery/mockery": "^1.6",
"nunomaduro/collision": "^8.1",
"phpunit/phpunit": "^11.0.1"
},
"autoload": {
"psr-4": {
"App\\": "app/",
"Database\\Factories\\": "database/factories/",
"Database\\Seeders\\": "database/seeders/"
}
},
"autoload-dev": {
"psr-4": {
"Tests\\": "tests/"
}
},
"scripts": {
"post-autoload-dump": [
"Illuminate\\Foundation\\ComposerScripts::postAutoloadDump",
"@php artisan package:discover --ansi"
],
"post-update-cmd": [
"@php artisan vendor:publish --tag=laravel-assets --ansi --force"
],
"post-root-package-install": [
"@php -r \"file_exists('.env') || copy('.env.example', '.env');\""
],
"post-create-project-cmd": [
"@php artisan key:generate --ansi",
"@php -r \"file_exists('database/database.sqlite') || touch('database/database.sqlite');\"",
"@php artisan migrate --graceful --ansi"
],
"dev": [
"Composer\\Config::disableProcessTimeout",
"npx concurrently -c \"#93c5fd,#c4b5fd,#fb7185,#fdba74\" \"php artisan serve\" \"php artisan queue:listen --tries=1\" \"php artisan pail --timeout=0\" \"npm run dev\" --names=server,queue,logs,vite"
]
},
"extra": {
"laravel": {
"dont-discover": []
}
},
"config": {
"optimize-autoloader": true,
"preferred-install": "dist",
"sort-packages": true,
"allow-plugins": {
"pestphp/pest-plugin": true,
"php-http/discovery": true
}
},
"minimum-stability": "stable",
"prefer-stable": true
}

8083
dashboard/composer.lock generated Normal file

File diff suppressed because it is too large

126
dashboard/config/app.php Normal file

@@ -0,0 +1,126 @@
<?php
return [
/*
|--------------------------------------------------------------------------
| Application Name
|--------------------------------------------------------------------------
|
| This value is the name of your application, which will be used when the
| framework needs to place the application's name in a notification or
| other UI elements where an application name needs to be displayed.
|
*/
'name' => env('APP_NAME', 'Laravel'),
/*
|--------------------------------------------------------------------------
| Application Environment
|--------------------------------------------------------------------------
|
| This value determines the "environment" your application is currently
| running in. This may determine how you prefer to configure various
| services the application utilizes. Set this in your ".env" file.
|
*/
'env' => env('APP_ENV', 'production'),
/*
|--------------------------------------------------------------------------
| Application Debug Mode
|--------------------------------------------------------------------------
|
| When your application is in debug mode, detailed error messages with
| stack traces will be shown on every error that occurs within your
| application. If disabled, a simple generic error page is shown.
|
*/
'debug' => (bool) env('APP_DEBUG', false),
/*
|--------------------------------------------------------------------------
| Application URL
|--------------------------------------------------------------------------
|
| This URL is used by the console to properly generate URLs when using
| the Artisan command line tool. You should set this to the root of
| the application so that it's available within Artisan commands.
|
*/
'url' => env('APP_URL', 'http://localhost'),
/*
|--------------------------------------------------------------------------
| Application Timezone
|--------------------------------------------------------------------------
|
| Here you may specify the default timezone for your application, which
| will be used by the PHP date and date-time functions. The timezone
| is set to "UTC" by default as it is suitable for most use cases.
|
*/
'timezone' => env('APP_TIMEZONE', 'UTC'),
/*
|--------------------------------------------------------------------------
| Application Locale Configuration
|--------------------------------------------------------------------------
|
| The application locale determines the default locale that will be used
| by Laravel's translation / localization methods. This option can be
| set to any locale for which you plan to have translation strings.
|
*/
'locale' => env('APP_LOCALE', 'en'),
'fallback_locale' => env('APP_FALLBACK_LOCALE', 'en'),
'faker_locale' => env('APP_FAKER_LOCALE', 'en_US'),
/*
|--------------------------------------------------------------------------
| Encryption Key
|--------------------------------------------------------------------------
|
| This key is utilized by Laravel's encryption services and should be set
| to a random, 32 character string to ensure that all encrypted values
| are secure. You should do this prior to deploying the application.
|
*/
'cipher' => 'AES-256-CBC',
'key' => env('APP_KEY'),
'previous_keys' => [
...array_filter(
explode(',', env('APP_PREVIOUS_KEYS', ''))
),
],
/*
|--------------------------------------------------------------------------
| Maintenance Mode Driver
|--------------------------------------------------------------------------
|
| These configuration options determine the driver used to determine and
| manage Laravel's "maintenance mode" status. The "cache" driver will
| allow maintenance mode to be controlled across multiple machines.
|
| Supported drivers: "file", "cache"
|
*/
'maintenance' => [
'driver' => env('APP_MAINTENANCE_DRIVER', 'file'),
'store' => env('APP_MAINTENANCE_STORE', 'database'),
],
];

115
dashboard/config/auth.php Normal file

@@ -0,0 +1,115 @@
<?php
return [
/*
|--------------------------------------------------------------------------
| Authentication Defaults
|--------------------------------------------------------------------------
|
| This option defines the default authentication "guard" and password
| reset "broker" for your application. You may change these values
| as required, but they're a perfect start for most applications.
|
*/
'defaults' => [
'guard' => env('AUTH_GUARD', 'web'),
'passwords' => env('AUTH_PASSWORD_BROKER', 'users'),
],
/*
|--------------------------------------------------------------------------
| Authentication Guards
|--------------------------------------------------------------------------
|
| Next, you may define every authentication guard for your application.
| Of course, a great default configuration has been defined for you
| which utilizes session storage plus the Eloquent user provider.
|
| All authentication guards have a user provider, which defines how the
| users are actually retrieved out of your database or other storage
| system used by the application. Typically, Eloquent is utilized.
|
| Supported: "session"
|
*/
'guards' => [
'web' => [
'driver' => 'session',
'provider' => 'users',
],
],
/*
|--------------------------------------------------------------------------
| User Providers
|--------------------------------------------------------------------------
|
| All authentication guards have a user provider, which defines how the
| users are actually retrieved out of your database or other storage
| system used by the application. Typically, Eloquent is utilized.
|
| If you have multiple user tables or models you may configure multiple
| providers to represent the model / table. These providers may then
| be assigned to any extra authentication guards you have defined.
|
| Supported: "database", "eloquent"
|
*/
'providers' => [
'users' => [
'driver' => 'eloquent',
'model' => env('AUTH_MODEL', App\Models\User::class),
],
// 'users' => [
// 'driver' => 'database',
// 'table' => 'users',
// ],
],
/*
|--------------------------------------------------------------------------
| Resetting Passwords
|--------------------------------------------------------------------------
|
| These configuration options specify the behavior of Laravel's password
| reset functionality, including the table utilized for token storage
| and the user provider that is invoked to actually retrieve users.
|
| The expiry time is the number of minutes that each reset token will be
| considered valid. This security feature keeps tokens short-lived so
| they have less time to be guessed. You may change this as needed.
|
| The throttle setting is the number of seconds a user must wait before
| generating more password reset tokens. This prevents the user from
| quickly generating a very large amount of password reset tokens.
|
*/
'passwords' => [
'users' => [
'provider' => 'users',
'table' => env('AUTH_PASSWORD_RESET_TOKEN_TABLE', 'password_reset_tokens'),
'expire' => 60,
'throttle' => 60,
],
],
/*
|--------------------------------------------------------------------------
| Password Confirmation Timeout
|--------------------------------------------------------------------------
|
| Here you may define the amount of seconds before a password confirmation
| window expires and users are asked to re-enter their password via the
| confirmation screen. By default, the timeout lasts for three hours.
|
*/
'password_timeout' => env('AUTH_PASSWORD_TIMEOUT', 10800),
];

108
dashboard/config/cache.php Normal file

@@ -0,0 +1,108 @@
<?php
use Illuminate\Support\Str;
return [
/*
|--------------------------------------------------------------------------
| Default Cache Store
|--------------------------------------------------------------------------
|
| This option controls the default cache store that will be used by the
| framework. This connection is utilized if another isn't explicitly
| specified when running a cache operation inside the application.
|
*/
'default' => env('CACHE_STORE', 'database'),
/*
|--------------------------------------------------------------------------
| Cache Stores
|--------------------------------------------------------------------------
|
| Here you may define all of the cache "stores" for your application as
| well as their drivers. You may even define multiple stores for the
| same cache driver to group types of items stored in your caches.
|
| Supported drivers: "array", "database", "file", "memcached",
| "redis", "dynamodb", "octane", "null"
|
*/
'stores' => [
'array' => [
'driver' => 'array',
'serialize' => false,
],
'database' => [
'driver' => 'database',
'connection' => env('DB_CACHE_CONNECTION'),
'table' => env('DB_CACHE_TABLE', 'cache'),
'lock_connection' => env('DB_CACHE_LOCK_CONNECTION'),
'lock_table' => env('DB_CACHE_LOCK_TABLE'),
],
'file' => [
'driver' => 'file',
'path' => storage_path('framework/cache/data'),
'lock_path' => storage_path('framework/cache/data'),
],
'memcached' => [
'driver' => 'memcached',
'persistent_id' => env('MEMCACHED_PERSISTENT_ID'),
'sasl' => [
env('MEMCACHED_USERNAME'),
env('MEMCACHED_PASSWORD'),
],
'options' => [
// Memcached::OPT_CONNECT_TIMEOUT => 2000,
],
'servers' => [
[
'host' => env('MEMCACHED_HOST', '127.0.0.1'),
'port' => env('MEMCACHED_PORT', 11211),
'weight' => 100,
],
],
],
'redis' => [
'driver' => 'redis',
'connection' => env('REDIS_CACHE_CONNECTION', 'cache'),
'lock_connection' => env('REDIS_CACHE_LOCK_CONNECTION', 'default'),
],
'dynamodb' => [
'driver' => 'dynamodb',
'key' => env('AWS_ACCESS_KEY_ID'),
'secret' => env('AWS_SECRET_ACCESS_KEY'),
'region' => env('AWS_DEFAULT_REGION', 'us-east-1'),
'table' => env('DYNAMODB_CACHE_TABLE', 'cache'),
'endpoint' => env('DYNAMODB_ENDPOINT'),
],
'octane' => [
'driver' => 'octane',
],
],
/*
|--------------------------------------------------------------------------
| Cache Key Prefix
|--------------------------------------------------------------------------
|
| When utilizing the APC, database, memcached, Redis, and DynamoDB cache
| stores, there might be other applications using the same cache. For
| that reason, you may prefix every cache key to avoid collisions.
|
*/
'prefix' => env('CACHE_PREFIX', Str::slug(env('APP_NAME', 'laravel'), '_').'_cache_'),
];

173
dashboard/config/database.php Normal file

@@ -0,0 +1,173 @@
<?php
use Illuminate\Support\Str;
return [
/*
|--------------------------------------------------------------------------
| Default Database Connection Name
|--------------------------------------------------------------------------
|
| Here you may specify which of the database connections below you wish
| to use as your default connection for database operations. This is
| the connection which will be utilized unless another connection
| is explicitly specified when you execute a query / statement.
|
*/
'default' => env('DB_CONNECTION', 'sqlite'),
/*
|--------------------------------------------------------------------------
| Database Connections
|--------------------------------------------------------------------------
|
| Below are all of the database connections defined for your application.
| An example configuration is provided for each database system which
| is supported by Laravel. You're free to add / remove connections.
|
*/
'connections' => [
'sqlite' => [
'driver' => 'sqlite',
'url' => env('DB_URL'),
'database' => env('DB_DATABASE', database_path('database.sqlite')),
'prefix' => '',
'foreign_key_constraints' => env('DB_FOREIGN_KEYS', true),
'busy_timeout' => null,
'journal_mode' => null,
'synchronous' => null,
],
'mysql' => [
'driver' => 'mysql',
'url' => env('DB_URL'),
'host' => env('DB_HOST', '127.0.0.1'),
'port' => env('DB_PORT', '3306'),
'database' => env('DB_DATABASE', 'laravel'),
'username' => env('DB_USERNAME', 'root'),
'password' => env('DB_PASSWORD', ''),
'unix_socket' => env('DB_SOCKET', ''),
'charset' => env('DB_CHARSET', 'utf8mb4'),
'collation' => env('DB_COLLATION', 'utf8mb4_unicode_ci'),
'prefix' => '',
'prefix_indexes' => true,
'strict' => true,
'engine' => null,
'options' => extension_loaded('pdo_mysql') ? array_filter([
PDO::MYSQL_ATTR_SSL_CA => env('MYSQL_ATTR_SSL_CA'),
]) : [],
],
'mariadb' => [
'driver' => 'mariadb',
'url' => env('DB_URL'),
'host' => env('DB_HOST', '127.0.0.1'),
'port' => env('DB_PORT', '3306'),
'database' => env('DB_DATABASE', 'laravel'),
'username' => env('DB_USERNAME', 'root'),
'password' => env('DB_PASSWORD', ''),
'unix_socket' => env('DB_SOCKET', ''),
'charset' => env('DB_CHARSET', 'utf8mb4'),
'collation' => env('DB_COLLATION', 'utf8mb4_unicode_ci'),
'prefix' => '',
'prefix_indexes' => true,
'strict' => true,
'engine' => null,
'options' => extension_loaded('pdo_mysql') ? array_filter([
PDO::MYSQL_ATTR_SSL_CA => env('MYSQL_ATTR_SSL_CA'),
]) : [],
],
'pgsql' => [
'driver' => 'pgsql',
'url' => env('DB_URL'),
'host' => env('DB_HOST', '127.0.0.1'),
'port' => env('DB_PORT', '5432'),
'database' => env('DB_DATABASE', 'laravel'),
'username' => env('DB_USERNAME', 'root'),
'password' => env('DB_PASSWORD', ''),
'charset' => env('DB_CHARSET', 'utf8'),
'prefix' => '',
'prefix_indexes' => true,
'search_path' => 'public',
'sslmode' => 'prefer',
],
'sqlsrv' => [
'driver' => 'sqlsrv',
'url' => env('DB_URL'),
'host' => env('DB_HOST', 'localhost'),
'port' => env('DB_PORT', '1433'),
'database' => env('DB_DATABASE', 'laravel'),
'username' => env('DB_USERNAME', 'root'),
'password' => env('DB_PASSWORD', ''),
'charset' => env('DB_CHARSET', 'utf8'),
'prefix' => '',
'prefix_indexes' => true,
// 'encrypt' => env('DB_ENCRYPT', 'yes'),
// 'trust_server_certificate' => env('DB_TRUST_SERVER_CERTIFICATE', 'false'),
],
],
/*
|--------------------------------------------------------------------------
| Migration Repository Table
|--------------------------------------------------------------------------
|
| This table keeps track of all the migrations that have already run for
| your application. Using this information, we can determine which of
| the migrations on disk haven't actually been run on the database.
|
*/
'migrations' => [
'table' => 'migrations',
'update_date_on_publish' => true,
],
/*
|--------------------------------------------------------------------------
| Redis Databases
|--------------------------------------------------------------------------
|
| Redis is an open source, fast, and advanced key-value store that also
| provides a richer body of commands than a typical key-value system
| such as Memcached. You may define your connection settings here.
|
*/
'redis' => [
'client' => env('REDIS_CLIENT', 'phpredis'),
'options' => [
'cluster' => env('REDIS_CLUSTER', 'redis'),
'prefix' => env('REDIS_PREFIX', Str::slug(env('APP_NAME', 'laravel'), '_').'_database_'),
],
'default' => [
'url' => env('REDIS_URL'),
'host' => env('REDIS_HOST', '127.0.0.1'),
'username' => env('REDIS_USERNAME'),
'password' => env('REDIS_PASSWORD'),
'port' => env('REDIS_PORT', '6379'),
'database' => env('REDIS_DB', '0'),
],
'cache' => [
'url' => env('REDIS_URL'),
'host' => env('REDIS_HOST', '127.0.0.1'),
'username' => env('REDIS_USERNAME'),
'password' => env('REDIS_PASSWORD'),
'port' => env('REDIS_PORT', '6379'),
'database' => env('REDIS_CACHE_DB', '1'),
],
],
];
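Each connection above also accepts a `DB_URL` DSN whose components override the discrete `host`/`port`/`database` settings. A rough plain-PHP sketch of how such a URL decomposes — illustrative only; Laravel's actual parser (`Illuminate\Support\ConfigurationUrlParser`) handles more cases, such as query-string options and driver aliases:

```php
<?php

// Hypothetical helper: split a database URL into the discrete config keys
// that the connection arrays above would otherwise read from the .env file.
function parseDatabaseUrl(string $url): array
{
    $parts = parse_url($url);

    return [
        'driver'   => $parts['scheme'] ?? null,
        'host'     => $parts['host'] ?? null,
        'port'     => $parts['port'] ?? null,
        'database' => isset($parts['path']) ? ltrim($parts['path'], '/') : null,
        'username' => $parts['user'] ?? null,
        'password' => $parts['pass'] ?? null,
    ];
}
```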

@ -0,0 +1,77 @@
<?php
return [
/*
|--------------------------------------------------------------------------
| Default Filesystem Disk
|--------------------------------------------------------------------------
|
| Here you may specify the default filesystem disk that should be used
| by the framework. The "local" disk, as well as a variety of cloud
| based disks are available to your application for file storage.
|
*/
'default' => env('FILESYSTEM_DISK', 'local'),
/*
|--------------------------------------------------------------------------
| Filesystem Disks
|--------------------------------------------------------------------------
|
| Below you may configure as many filesystem disks as necessary, and you
| may even configure multiple disks for the same driver. Examples for
| most supported storage drivers are configured here for reference.
|
| Supported drivers: "local", "ftp", "sftp", "s3"
|
*/
'disks' => [
'local' => [
'driver' => 'local',
'root' => storage_path('app/private'),
'serve' => true,
'throw' => false,
],
'public' => [
'driver' => 'local',
'root' => storage_path('app/public'),
'url' => env('APP_URL').'/storage',
'visibility' => 'public',
'throw' => false,
],
's3' => [
'driver' => 's3',
'key' => env('AWS_ACCESS_KEY_ID'),
'secret' => env('AWS_SECRET_ACCESS_KEY'),
'region' => env('AWS_DEFAULT_REGION'),
'bucket' => env('AWS_BUCKET'),
'url' => env('AWS_URL'),
'endpoint' => env('AWS_ENDPOINT'),
'use_path_style_endpoint' => env('AWS_USE_PATH_STYLE_ENDPOINT', false),
'throw' => false,
],
],
/*
|--------------------------------------------------------------------------
| Symbolic Links
|--------------------------------------------------------------------------
|
| Here you may configure the symbolic links that will be created when the
| `storage:link` Artisan command is executed. The array keys should be
| the locations of the links and the values should be their targets.
|
*/
'links' => [
public_path('storage') => storage_path('app/public'),
],
];

@ -0,0 +1,132 @@
<?php
use Monolog\Handler\NullHandler;
use Monolog\Handler\StreamHandler;
use Monolog\Handler\SyslogUdpHandler;
use Monolog\Processor\PsrLogMessageProcessor;
return [
/*
|--------------------------------------------------------------------------
| Default Log Channel
|--------------------------------------------------------------------------
|
| This option defines the default log channel that is utilized to write
| messages to your logs. The value provided here should match one of
| the channels present in the list of "channels" configured below.
|
*/
'default' => env('LOG_CHANNEL', 'stack'),
/*
|--------------------------------------------------------------------------
| Deprecations Log Channel
|--------------------------------------------------------------------------
|
| This option controls the log channel that should be used to log warnings
| regarding deprecated PHP and library features. This allows you to get
| your application ready for upcoming major versions of dependencies.
|
*/
'deprecations' => [
'channel' => env('LOG_DEPRECATIONS_CHANNEL', 'null'),
'trace' => env('LOG_DEPRECATIONS_TRACE', false),
],
/*
|--------------------------------------------------------------------------
| Log Channels
|--------------------------------------------------------------------------
|
| Here you may configure the log channels for your application. Laravel
| utilizes the Monolog PHP logging library, which includes a variety
| of powerful log handlers and formatters that you're free to use.
|
| Available drivers: "single", "daily", "slack", "syslog",
| "errorlog", "monolog", "custom", "stack"
|
*/
'channels' => [
'stack' => [
'driver' => 'stack',
'channels' => explode(',', env('LOG_STACK', 'single')),
'ignore_exceptions' => false,
],
'single' => [
'driver' => 'single',
'path' => storage_path('logs/laravel.log'),
'level' => env('LOG_LEVEL', 'debug'),
'replace_placeholders' => true,
],
'daily' => [
'driver' => 'daily',
'path' => storage_path('logs/laravel.log'),
'level' => env('LOG_LEVEL', 'debug'),
'days' => env('LOG_DAILY_DAYS', 14),
'replace_placeholders' => true,
],
'slack' => [
'driver' => 'slack',
'url' => env('LOG_SLACK_WEBHOOK_URL'),
'username' => env('LOG_SLACK_USERNAME', 'Laravel Log'),
'emoji' => env('LOG_SLACK_EMOJI', ':boom:'),
'level' => env('LOG_LEVEL', 'critical'),
'replace_placeholders' => true,
],
'papertrail' => [
'driver' => 'monolog',
'level' => env('LOG_LEVEL', 'debug'),
'handler' => env('LOG_PAPERTRAIL_HANDLER', SyslogUdpHandler::class),
'handler_with' => [
'host' => env('PAPERTRAIL_URL'),
'port' => env('PAPERTRAIL_PORT'),
'connectionString' => 'tls://'.env('PAPERTRAIL_URL').':'.env('PAPERTRAIL_PORT'),
],
'processors' => [PsrLogMessageProcessor::class],
],
'stderr' => [
'driver' => 'monolog',
'level' => env('LOG_LEVEL', 'debug'),
'handler' => StreamHandler::class,
'formatter' => env('LOG_STDERR_FORMATTER'),
'with' => [
'stream' => 'php://stderr',
],
'processors' => [PsrLogMessageProcessor::class],
],
'syslog' => [
'driver' => 'syslog',
'level' => env('LOG_LEVEL', 'debug'),
'facility' => env('LOG_SYSLOG_FACILITY', LOG_USER),
'replace_placeholders' => true,
],
'errorlog' => [
'driver' => 'errorlog',
'level' => env('LOG_LEVEL', 'debug'),
'replace_placeholders' => true,
],
'null' => [
'driver' => 'monolog',
'handler' => NullHandler::class,
],
'emergency' => [
'path' => storage_path('logs/laravel.log'),
],
],
];
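The `stack` channel above expands the comma-separated `LOG_STACK` variable into a channel list with `explode()`. A small sketch of that expansion, with a `trim` step added here for stray whitespace (the trimming is an addition for robustness, not part of the config above):

```php
<?php

// Expand a comma-separated LOG_STACK value, e.g. "single,slack",
// into the channel array the 'stack' driver fans log records out to.
function stackChannels(string $logStack): array
{
    return array_map('trim', explode(',', $logStack));
}
```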

dashboard/config/mail.php
@ -0,0 +1,116 @@
<?php
return [
/*
|--------------------------------------------------------------------------
| Default Mailer
|--------------------------------------------------------------------------
|
| This option controls the default mailer that is used to send all email
| messages unless another mailer is explicitly specified when sending
| the message. All additional mailers can be configured within the
| "mailers" array. Examples of each type of mailer are provided.
|
*/
'default' => env('MAIL_MAILER', 'log'),
/*
|--------------------------------------------------------------------------
| Mailer Configurations
|--------------------------------------------------------------------------
|
| Here you may configure all of the mailers used by your application plus
| their respective settings. Several examples have been configured for
| you and you are free to add your own as your application requires.
|
| Laravel supports a variety of mail "transport" drivers that can be used
| when delivering an email. You may specify which one you're using for
| your mailers below. You may also add additional mailers if needed.
|
| Supported: "smtp", "sendmail", "mailgun", "ses", "ses-v2",
| "postmark", "resend", "log", "array",
| "failover", "roundrobin"
|
*/
'mailers' => [
'smtp' => [
'transport' => 'smtp',
'scheme' => env('MAIL_SCHEME'),
'url' => env('MAIL_URL'),
'host' => env('MAIL_HOST', '127.0.0.1'),
'port' => env('MAIL_PORT', 2525),
'username' => env('MAIL_USERNAME'),
'password' => env('MAIL_PASSWORD'),
'timeout' => null,
'local_domain' => env('MAIL_EHLO_DOMAIN', parse_url(env('APP_URL', 'http://localhost'), PHP_URL_HOST)),
],
'ses' => [
'transport' => 'ses',
],
'postmark' => [
'transport' => 'postmark',
// 'message_stream_id' => env('POSTMARK_MESSAGE_STREAM_ID'),
// 'client' => [
// 'timeout' => 5,
// ],
],
'resend' => [
'transport' => 'resend',
],
'sendmail' => [
'transport' => 'sendmail',
'path' => env('MAIL_SENDMAIL_PATH', '/usr/sbin/sendmail -bs -i'),
],
'log' => [
'transport' => 'log',
'channel' => env('MAIL_LOG_CHANNEL'),
],
'array' => [
'transport' => 'array',
],
'failover' => [
'transport' => 'failover',
'mailers' => [
'smtp',
'log',
],
],
'roundrobin' => [
'transport' => 'roundrobin',
'mailers' => [
'ses',
'postmark',
],
],
],
/*
|--------------------------------------------------------------------------
| Global "From" Address
|--------------------------------------------------------------------------
|
| You may wish for all emails sent by your application to be sent from
| the same address. Here you may specify a name and address that is
| used globally for all emails that are sent by your application.
|
*/
'from' => [
'address' => env('MAIL_FROM_ADDRESS', 'hello@example.com'),
'name' => env('MAIL_FROM_NAME', 'Example'),
],
];

dashboard/config/queue.php
@ -0,0 +1,112 @@
<?php
return [
/*
|--------------------------------------------------------------------------
| Default Queue Connection Name
|--------------------------------------------------------------------------
|
| Laravel's queue supports a variety of backends via a single, unified
| API, giving you convenient access to each backend using identical
| syntax for each. The default queue connection is defined below.
|
*/
'default' => env('QUEUE_CONNECTION', 'database'),
/*
|--------------------------------------------------------------------------
| Queue Connections
|--------------------------------------------------------------------------
|
| Here you may configure the connection options for every queue backend
| used by your application. An example configuration is provided for
| each backend supported by Laravel. You're also free to add more.
|
| Drivers: "sync", "database", "beanstalkd", "sqs", "redis", "null"
|
*/
'connections' => [
'sync' => [
'driver' => 'sync',
],
'database' => [
'driver' => 'database',
'connection' => env('DB_QUEUE_CONNECTION'),
'table' => env('DB_QUEUE_TABLE', 'jobs'),
'queue' => env('DB_QUEUE', 'default'),
'retry_after' => (int) env('DB_QUEUE_RETRY_AFTER', 90),
'after_commit' => false,
],
'beanstalkd' => [
'driver' => 'beanstalkd',
'host' => env('BEANSTALKD_QUEUE_HOST', 'localhost'),
'queue' => env('BEANSTALKD_QUEUE', 'default'),
'retry_after' => (int) env('BEANSTALKD_QUEUE_RETRY_AFTER', 90),
'block_for' => 0,
'after_commit' => false,
],
'sqs' => [
'driver' => 'sqs',
'key' => env('AWS_ACCESS_KEY_ID'),
'secret' => env('AWS_SECRET_ACCESS_KEY'),
'prefix' => env('SQS_PREFIX', 'https://sqs.us-east-1.amazonaws.com/your-account-id'),
'queue' => env('SQS_QUEUE', 'default'),
'suffix' => env('SQS_SUFFIX'),
'region' => env('AWS_DEFAULT_REGION', 'us-east-1'),
'after_commit' => false,
],
'redis' => [
'driver' => 'redis',
'connection' => env('REDIS_QUEUE_CONNECTION', 'default'),
'queue' => env('REDIS_QUEUE', 'default'),
'retry_after' => (int) env('REDIS_QUEUE_RETRY_AFTER', 90),
'block_for' => null,
'after_commit' => false,
],
],
/*
|--------------------------------------------------------------------------
| Job Batching
|--------------------------------------------------------------------------
|
| The following options configure the database and table that store job
| batching information. These options can be updated to any database
| connection and table which has been defined by your application.
|
*/
'batching' => [
'database' => env('DB_CONNECTION', 'sqlite'),
'table' => 'job_batches',
],
/*
|--------------------------------------------------------------------------
| Failed Queue Jobs
|--------------------------------------------------------------------------
|
| These options configure the behavior of failed queue job logging so you
| can control how and where failed jobs are stored. Laravel ships with
| support for storing failed jobs in a simple file or in a database.
|
| Supported drivers: "database-uuids", "dynamodb", "file", "null"
|
*/
'failed' => [
'driver' => env('QUEUE_FAILED_DRIVER', 'database-uuids'),
'database' => env('DB_CONNECTION', 'sqlite'),
'table' => 'failed_jobs',
],
];

@ -0,0 +1,38 @@
<?php
return [
/*
|--------------------------------------------------------------------------
| Third Party Services
|--------------------------------------------------------------------------
|
| This file is for storing the credentials for third party services such
| as Mailgun, Postmark, AWS and more. This file provides the de facto
| location for this type of information, allowing packages to have
| a conventional file to locate the various service credentials.
|
*/
'postmark' => [
'token' => env('POSTMARK_TOKEN'),
],
'ses' => [
'key' => env('AWS_ACCESS_KEY_ID'),
'secret' => env('AWS_SECRET_ACCESS_KEY'),
'region' => env('AWS_DEFAULT_REGION', 'us-east-1'),
],
'resend' => [
'key' => env('RESEND_KEY'),
],
'slack' => [
'notifications' => [
'bot_user_oauth_token' => env('SLACK_BOT_USER_OAUTH_TOKEN'),
'channel' => env('SLACK_BOT_USER_DEFAULT_CHANNEL'),
],
],
];

@ -0,0 +1,217 @@
<?php
use Illuminate\Support\Str;
return [
/*
|--------------------------------------------------------------------------
| Default Session Driver
|--------------------------------------------------------------------------
|
| This option determines the default session driver that is utilized for
| incoming requests. Laravel supports a variety of storage options to
| persist session data. Database storage is a great default choice.
|
| Supported: "file", "cookie", "database", "apc",
| "memcached", "redis", "dynamodb", "array"
|
*/
'driver' => env('SESSION_DRIVER', 'database'),
/*
|--------------------------------------------------------------------------
| Session Lifetime
|--------------------------------------------------------------------------
|
| Here you may specify the number of minutes that you wish the session
| to be allowed to remain idle before it expires. If you want them
| to expire immediately when the browser is closed then you may
| indicate that via the expire_on_close configuration option.
|
*/
'lifetime' => env('SESSION_LIFETIME', 120),
'expire_on_close' => env('SESSION_EXPIRE_ON_CLOSE', false),
/*
|--------------------------------------------------------------------------
| Session Encryption
|--------------------------------------------------------------------------
|
| This option allows you to easily specify that all of your session data
| should be encrypted before it's stored. All encryption is performed
| automatically by Laravel and you may use the session like normal.
|
*/
'encrypt' => env('SESSION_ENCRYPT', false),
/*
|--------------------------------------------------------------------------
| Session File Location
|--------------------------------------------------------------------------
|
| When utilizing the "file" session driver, the session files are placed
| on disk. The default storage location is defined here; however, you
| are free to provide another location where they should be stored.
|
*/
'files' => storage_path('framework/sessions'),
/*
|--------------------------------------------------------------------------
| Session Database Connection
|--------------------------------------------------------------------------
|
| When using the "database" or "redis" session drivers, you may specify a
| connection that should be used to manage these sessions. This should
| correspond to a connection in your database configuration options.
|
*/
'connection' => env('SESSION_CONNECTION'),
/*
|--------------------------------------------------------------------------
| Session Database Table
|--------------------------------------------------------------------------
|
| When using the "database" session driver, you may specify the table to
| be used to store sessions. Of course, a sensible default is defined
| for you; however, you're welcome to change this to another table.
|
*/
'table' => env('SESSION_TABLE', 'sessions'),
/*
|--------------------------------------------------------------------------
| Session Cache Store
|--------------------------------------------------------------------------
|
| When using one of the framework's cache driven session backends, you may
| define the cache store which should be used to store the session data
| between requests. This must match one of your defined cache stores.
|
| Affects: "apc", "dynamodb", "memcached", "redis"
|
*/
'store' => env('SESSION_STORE'),
/*
|--------------------------------------------------------------------------
| Session Sweeping Lottery
|--------------------------------------------------------------------------
|
| Some session drivers must manually sweep their storage location to get
| rid of old sessions from storage. Here are the chances that it will
| happen on a given request. By default, the odds are 2 out of 100.
|
*/
'lottery' => [2, 100],
/*
|--------------------------------------------------------------------------
| Session Cookie Name
|--------------------------------------------------------------------------
|
| Here you may change the name of the session cookie that is created by
| the framework. Typically, you should not need to change this value
| since doing so does not grant a meaningful security improvement.
|
*/
'cookie' => env(
'SESSION_COOKIE',
Str::slug(env('APP_NAME', 'laravel'), '_').'_session'
),
/*
|--------------------------------------------------------------------------
| Session Cookie Path
|--------------------------------------------------------------------------
|
| The session cookie path determines the path for which the cookie will
| be regarded as available. Typically, this will be the root path of
| your application, but you're free to change this when necessary.
|
*/
'path' => env('SESSION_PATH', '/'),
/*
|--------------------------------------------------------------------------
| Session Cookie Domain
|--------------------------------------------------------------------------
|
| This value determines the domain and subdomains the session cookie is
| available to. By default, the cookie will be available to the root
| domain and all subdomains. Typically, this shouldn't be changed.
|
*/
'domain' => env('SESSION_DOMAIN'),
/*
|--------------------------------------------------------------------------
| HTTPS Only Cookies
|--------------------------------------------------------------------------
|
| By setting this option to true, session cookies will only be sent back
| to the server if the browser has a HTTPS connection. This will keep
| the cookie from being sent to you when it can't be done securely.
|
*/
'secure' => env('SESSION_SECURE_COOKIE'),
/*
|--------------------------------------------------------------------------
| HTTP Access Only
|--------------------------------------------------------------------------
|
| Setting this value to true will prevent JavaScript from accessing the
| value of the cookie and the cookie will only be accessible through
| the HTTP protocol. It's unlikely you should disable this option.
|
*/
'http_only' => env('SESSION_HTTP_ONLY', true),
/*
|--------------------------------------------------------------------------
| Same-Site Cookies
|--------------------------------------------------------------------------
|
| This option determines how your cookies behave when cross-site requests
| take place, and can be used to mitigate CSRF attacks. By default, we
| will set this value to "lax" to permit secure cross-site requests.
|
| See: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Set-Cookie#samesitesamesite-value
|
| Supported: "lax", "strict", "none", null
|
*/
'same_site' => env('SESSION_SAME_SITE', 'lax'),
/*
|--------------------------------------------------------------------------
| Partitioned Cookies
|--------------------------------------------------------------------------
|
| Setting this value to true will tie the cookie to the top-level site for
| a cross-site context. Partitioned cookies are accepted by the browser
| when flagged "secure" and the Same-Site attribute is set to "none".
|
*/
'partitioned' => env('SESSION_PARTITIONED_COOKIE', false),
];

dashboard/database/.gitignore
@ -0,0 +1 @@
*.sqlite*
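The single pattern above ignores the SQLite database along with its WAL and journal side files. A quick check using PHP's `fnmatch()`, which applies the same shell-style globbing as `.gitignore` does for simple patterns (gitignore has extra semantics like `**` and `!` negation that `fnmatch()` does not cover):

```php
<?php

// Shell-style glob matching, close enough to gitignore for this one pattern.
$pattern = '*.sqlite*';

$ignored = fnmatch($pattern, 'database.sqlite')          // the database itself
        && fnmatch($pattern, 'database.sqlite-wal')      // write-ahead log
        && fnmatch($pattern, 'database.sqlite-journal'); // rollback journal
```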

@ -0,0 +1,44 @@
<?php
namespace Database\Factories;
use Illuminate\Database\Eloquent\Factories\Factory;
use Illuminate\Support\Facades\Hash;
use Illuminate\Support\Str;
/**
* @extends \Illuminate\Database\Eloquent\Factories\Factory<\App\Models\User>
*/
class UserFactory extends Factory
{
/**
* The current password being used by the factory.
*/
protected static ?string $password;
/**
* Define the model's default state.
*
* @return array<string, mixed>
*/
public function definition(): array
{
return [
'name' => fake()->name(),
'email' => fake()->unique()->safeEmail(),
'email_verified_at' => now(),
'password' => static::$password ??= Hash::make('password'),
'remember_token' => Str::random(10),
];
}
/**
* Indicate that the model's email address should be unverified.
*/
public function unverified(): static
{
return $this->state(fn (array $attributes) => [
'email_verified_at' => null,
]);
}
}
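The factory above hashes the shared password only once: `static::$password ??= Hash::make('password')` assigns on first use and returns the cached hash on every later call, which keeps seeding fast since bcrypt is deliberately slow. A plain-PHP sketch of the same null-coalescing-assignment memoization, with `password_hash()` standing in for Laravel's `Hash` facade:

```php
<?php

class PasswordCache
{
    // Backing store for the memoized hash, mirroring UserFactory::$password.
    protected static ?string $password = null;

    public static function get(): string
    {
        // ??= evaluates the right-hand side only while the property is null,
        // so the expensive bcrypt hash is computed once per process.
        return static::$password ??= password_hash('password', PASSWORD_BCRYPT);
    }
}
```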

@ -0,0 +1,49 @@
<?php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;
return new class extends Migration
{
/**
* Run the migrations.
*/
public function up(): void
{
Schema::create('users', function (Blueprint $table) {
$table->id();
$table->string('name');
$table->string('email')->unique();
$table->timestamp('email_verified_at')->nullable();
$table->string('password');
$table->rememberToken();
$table->timestamps();
});
Schema::create('password_reset_tokens', function (Blueprint $table) {
$table->string('email')->primary();
$table->string('token');
$table->timestamp('created_at')->nullable();
});
Schema::create('sessions', function (Blueprint $table) {
$table->string('id')->primary();
$table->foreignId('user_id')->nullable()->index();
$table->string('ip_address', 45)->nullable();
$table->text('user_agent')->nullable();
$table->longText('payload');
$table->integer('last_activity')->index();
});
}
/**
* Reverse the migrations.
*/
public function down(): void
{
Schema::dropIfExists('users');
Schema::dropIfExists('password_reset_tokens');
Schema::dropIfExists('sessions');
}
};

@ -0,0 +1,35 @@
<?php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;
return new class extends Migration
{
/**
* Run the migrations.
*/
public function up(): void
{
Schema::create('cache', function (Blueprint $table) {
$table->string('key')->primary();
$table->mediumText('value');
$table->integer('expiration');
});
Schema::create('cache_locks', function (Blueprint $table) {
$table->string('key')->primary();
$table->string('owner');
$table->integer('expiration');
});
}
/**
* Reverse the migrations.
*/
public function down(): void
{
Schema::dropIfExists('cache');
Schema::dropIfExists('cache_locks');
}
};

@ -0,0 +1,57 @@
<?php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;
return new class extends Migration
{
/**
* Run the migrations.
*/
public function up(): void
{
Schema::create('jobs', function (Blueprint $table) {
$table->id();
$table->string('queue')->index();
$table->longText('payload');
$table->unsignedTinyInteger('attempts');
$table->unsignedInteger('reserved_at')->nullable();
$table->unsignedInteger('available_at');
$table->unsignedInteger('created_at');
});
Schema::create('job_batches', function (Blueprint $table) {
$table->string('id')->primary();
$table->string('name');
$table->integer('total_jobs');
$table->integer('pending_jobs');
$table->integer('failed_jobs');
$table->longText('failed_job_ids');
$table->mediumText('options')->nullable();
$table->integer('cancelled_at')->nullable();
$table->integer('created_at');
$table->integer('finished_at')->nullable();
});
Schema::create('failed_jobs', function (Blueprint $table) {
$table->id();
$table->string('uuid')->unique();
$table->text('connection');
$table->text('queue');
$table->longText('payload');
$table->longText('exception');
$table->timestamp('failed_at')->useCurrent();
});
}
/**
* Reverse the migrations.
*/
public function down(): void
{
Schema::dropIfExists('jobs');
Schema::dropIfExists('job_batches');
Schema::dropIfExists('failed_jobs');
}
};

@ -0,0 +1,23 @@
<?php
namespace Database\Seeders;
use App\Models\User;
// use Illuminate\Database\Console\Seeds\WithoutModelEvents;
use Illuminate\Database\Seeder;
class DatabaseSeeder extends Seeder
{
/**
* Seed the application's database.
*/
public function run(): void
{
// User::factory(10)->create();
User::factory()->create([
'name' => 'Test User',
'email' => 'test@example.com',
]);
}
}

dashboard/package-lock.json (generated): diff suppressed because it is too large.

dashboard/package.json
@ -0,0 +1,23 @@
{
"private": true,
"type": "module",
"scripts": {
"build": "vite build",
"dev": "vite"
},
"devDependencies": {
"autoprefixer": "^10.4.20",
"axios": "^1.7.4",
"concurrently": "^9.0.1",
"laravel-vite-plugin": "^1.0",
"postcss": "^8.4.47",
"tailwindcss": "^3.4.13",
"vite": "^5.0"
},
"dependencies": {
"@patternfly/patternfly": "^6.0.0",
"@picocss/pico": "^2.0.6",
"echarts": "^5.5.1",
"leaflet": "^1.9.4"
}
}

dashboard/phpunit.xml
@ -0,0 +1,33 @@
<?xml version="1.0" encoding="UTF-8"?>
<phpunit xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="vendor/phpunit/phpunit/phpunit.xsd"
bootstrap="vendor/autoload.php"
colors="true"
>
<testsuites>
<testsuite name="Unit">
<directory>tests/Unit</directory>
</testsuite>
<testsuite name="Feature">
<directory>tests/Feature</directory>
</testsuite>
</testsuites>
<source>
<include>
<directory>app</directory>
</include>
</source>
<php>
<env name="APP_ENV" value="testing"/>
<env name="APP_MAINTENANCE_DRIVER" value="file"/>
<env name="BCRYPT_ROUNDS" value="4"/>
<env name="CACHE_STORE" value="array"/>
<!-- <env name="DB_CONNECTION" value="sqlite"/> -->
<!-- <env name="DB_DATABASE" value=":memory:"/> -->
<env name="MAIL_MAILER" value="array"/>
<env name="PULSE_ENABLED" value="false"/>
<env name="QUEUE_CONNECTION" value="sync"/>
<env name="SESSION_DRIVER" value="array"/>
<env name="TELESCOPE_ENABLED" value="false"/>
</php>
</phpunit>

@ -0,0 +1,6 @@
export default {
plugins: {
tailwindcss: {},
autoprefixer: {},
},
};

@ -0,0 +1,21 @@
<IfModule mod_rewrite.c>
<IfModule mod_negotiation.c>
Options -MultiViews -Indexes
</IfModule>
RewriteEngine On
# Handle Authorization Header
RewriteCond %{HTTP:Authorization} .
RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
# Redirect Trailing Slashes If Not A Folder...
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} (.+)/$
RewriteRule ^ %1 [L,R=301]
# Send Requests To Front Controller...
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^ index.php [L]
</IfModule>
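The second rule block above issues a 301 that strips a trailing slash whenever the requested path is not a directory. A plain-PHP sketch of the same decision — a hypothetical helper for illustration only, since in production Apache applies these rewrites before PHP ever runs:

```php
<?php

// Return the redirect target the trailing-slash rule would produce,
// or null when no redirect applies.
function trailingSlashRedirect(string $uri, bool $isDirectory): ?string
{
    // RewriteCond %{REQUEST_FILENAME} !-d  -- directories keep their slash.
    if ($isDirectory) {
        return null;
    }

    // RewriteCond %{REQUEST_URI} (.+)/$  -- only non-root paths ending in a slash.
    if ($uri !== '/' && str_ends_with($uri, '/')) {
        return rtrim($uri, '/');
    }

    return null;
}
```

Requests like `/about/` are redirected to `/about`, while `/storage/` (a real directory) and `/about` (no trailing slash) pass through untouched.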

@ -0,0 +1,17 @@
<?php
use Illuminate\Http\Request;
define('LARAVEL_START', microtime(true));
// Determine if the application is in maintenance mode...
if (file_exists($maintenance = __DIR__.'/../storage/framework/maintenance.php')) {
require $maintenance;
}
// Register the Composer autoloader...
require __DIR__.'/../vendor/autoload.php';
// Bootstrap Laravel and handle the request...
(require_once __DIR__.'/../bootstrap/app.php')
->handleRequest(Request::capture());

@ -0,0 +1,2 @@
User-agent: *
Disallow:

@ -0,0 +1,330 @@
/* 1. Use a more-intuitive box-sizing model */
*, *::before, *::after {
box-sizing: border-box;
}
/* 2. Remove default margin */
* {
margin: 0;
font-family: sans-serif;
}
body {
/* 3. Add accessible line-height */
line-height: 1.5;
/* 4. Improve text rendering */
-webkit-font-smoothing: antialiased;
padding: 0 1em;
height: 100vh;
background-image: radial-gradient(73% 147%, #EADFDF 59%, #ECE2DF 100%), radial-gradient(91% 146%, rgba(255,255,255,0.50) 47%, rgba(0,0,0,0.50) 100%);
background-blend-mode: screen;
}
/* 5. Improve media defaults */
img, picture, video, canvas, svg {
display: block;
max-width: 100%;
}
/* 6. Inherit fonts for form controls */
input, button, textarea, select {
font: inherit;
}
/* 7. Avoid text overflows */
p, h1, h2, h3, h4, h5, h6 {
overflow-wrap: break-word;
}
/* 8. Improve line wrapping */
p {
text-wrap: pretty;
}
h1, h2, h3, h4, h5, h6 {
text-wrap: balance;
}
dt{
font-weight: 600;
}
dd + dt{
margin-top: .2em;
}
nav + button,
span + button{
margin-left: .5em;
}
ul{
padding-left: 1em;
}
p + ul{
margin-top: 1em;
}
button[popovertarget]{
background: no-repeat center / .3em #4d004b url("data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 192 512'%3E%3C!--!Font Awesome Free 6.7.2 by @fontawesome - https://fontawesome.com License - https://fontawesome.com/license/free Copyright 2025 Fonticons, Inc.--%3E%3Cpath fill='%23fff' d='M48 80a48 48 0 1 1 96 0A48 48 0 1 1 48 80zM0 224c0-17.7 14.3-32 32-32l64 0c17.7 0 32 14.3 32 32l0 224 32 0c17.7 0 32 14.3 32 32s-14.3 32-32 32L32 512c-17.7 0-32-14.3-32-32s14.3-32 32-32l32 0 0-192-32 0c-17.7 0-32-14.3-32-32z'/%3E%3C/svg%3E%0A");
cursor: pointer;
display: inline-block;
width: 1.5em;
height: 1.5em;
border-radius: 50%;
border: 1px solid #fff;
}
button[popovertarget]::before{
color: #fff;
font-weight: 700;
}
button[popovertarget]>span{
position: absolute;
left: -999em;
top: -999em;
}
[popover] {
border: none;
border-radius: 1em;
background: #fff;
padding: 1.5em;
box-shadow: .0625em .0625em .625em rgba(0, 0, 0, 0.1);
max-width: 40em;
top: 4em;
margin: 0 auto;
}
[popover]::backdrop{
background-color: rgba(0,0,0,.5);
}
[popover] h2{
font-size: 1em;
}
[popover] h3{
font-size: .95em;
margin-top: 1em;
}
p.formula{
font-family: monospace;
border: 1px solid #aaa;
padding: .5em 1em;
}
p + p{
margin-top: 1em;
}
/*
9. Create a root stacking context
*/
#root, #__next {
isolation: isolate;
}
body>header{
position: fixed;
top: 0;
left: 0;
width: 100%;
height: 3em;
background: #ccc;
z-index: 99;
display: flex;
align-items: center;
padding: 0 1em;
}
body>header>nav{
text-align: center;
min-width: 10em;
background: #fff;
border-radius: .2em;
position: relative;
border: 1px solid #fff;
}
body>header>nav>ul{
position: absolute;
background: #fff;
width: calc(100% + 2px);
list-style: none;
padding: 0 0 1em;
top: -999em;
left: -999em;
border-radius: 0 0 .2em .2em;
border-left: 1px solid #aaa;
border-right: 1px solid #aaa;
border-bottom: 1px solid #aaa;
}
body>header>nav:hover{
border-radius: .2em .2em 0 0;
border: 1px solid #aaa;
}
body>header>nav:hover ul{
top: initial;
left: -1px;
}
body>header>nav>ul>li a,
body>header>nav>strong{
display: inline-block;
padding: .2em .4em;
}
a{
color: #000;
}
a:hover,
a:focus{
color: #aaa;
}
main{
width: 100%;
height: 100vh;
padding: 4em 0 1em;
display: grid;
gap: .5em;
}
body.overview main{
grid-template-columns: repeat(8, minmax(1%, 50%));
grid-template-rows: repeat(4, 1fr);
grid-template-areas:
"chart1 chart1 chart1 chart2 chart2 chart2 chart4 chart4"
"chart1 chart1 chart1 chart2 chart2 chart2 chart4 chart4"
"chart1 chart1 chart1 chart3 chart3 chart3 chart4 chart4"
"chart1 chart1 chart1 chart3 chart3 chart3 chart4 chart4"
}
body.region main{
grid-template-columns: repeat(4, minmax(10%, 50%));
grid-template-rows: repeat(6, 1fr) 4em;
grid-template-areas:
"chart1 chart1 chart2 chart2"
"chart1 chart1 chart2 chart2"
"chart1 chart1 chart3 chart4"
"chart1 chart1 chart3 chart4"
"chart1 chart1 chart6 chart6"
"chart1 chart1 chart6 chart6"
"chart1 chart1 timeline timeline";
}
body.property main{
grid-template-columns: repeat(4, minmax(10%, 50%));
grid-template-rows: repeat(4, 1fr) 4em;
grid-template-areas:
"chart1 chart1 chart2 chart2"
"chart1 chart1 chart2 chart2"
"chart5 chart5 chart3 chart4"
"chart5 chart5 chart3 chart4"
"chart5 chart5 timeline timeline";
}
article{
background: #f9f9f9;
border: .0625em solid #ccc;
box-shadow: 0 5px 10px rgba(154,160,185,.05), 0 15px 40px rgba(166,173,201,.2);
border-radius: .2em;
display: grid;
}
article.header{
grid-template-columns: 100%;
grid-template-rows: minmax(1%, 2em) 1fr;
padding: .5em 1em 1em .5em;
}
article.map{
padding: 0;
}
article.map>header{
padding: .5em 1em 1em .5em;
}
article>header{
display: grid;
grid-template-columns: 1fr 1em;
grid-template-rows: 1fr;
}
article>header>h2{
font-size: .8em;
font-weight: 600;
}
@media(max-width: 960px){
body{
height: auto;
}
body.overview main,
body.region main,
body.property main{
height: auto;
grid-template-columns: 100%;
grid-template-rows: repeat(5, minmax(20em, 25em)) 4em;
grid-template-areas: "chart1" "chart2" "chart3" "chart4" "chart5" "chart6" "timeline";
}
body.overview main{
grid-template-rows: minmax(20em, 40em) repeat(3, minmax(20em, 25em));
grid-template-areas: "chart1" "chart2" "chart3" "chart4";
}
body.region main{
grid-template-rows: minmax(20em, 40em) repeat(4, minmax(20em, 25em)) 4em;
grid-template-areas: "chart1" "chart2" "chart3" "chart4" "chart6" "timeline";
}
body.property main{
grid-template-rows: repeat(5, minmax(20em, 25em)) 4em;
grid-template-areas: "chart1" "chart2" "chart3" "chart4" "chart5" "timeline";
}
}
.leaflet-marker-icon span{
background: #4d004b;
width: 2rem;
height: 2rem;
display: block;
left: -1rem;
top: -1rem;
position: relative;
border-radius: 50% 50% 0;
transform: rotate(45deg);
border: 2px solid #fff
}
.leaflet-marker-icon.region1 span{
background: #8c96c6;
}
.leaflet-marker-icon.region2 span{
background: #88419d;
}
.leaflet-marker-icon.region3 span{
background: #810f7c;
}
.leaflet-marker-icon.region4 span{
background: #4d004b;
}

dashboard/resources/css/pico.min.css (vendored)

File diff suppressed because one or more lines are too long

View File

@ -0,0 +1,4 @@
import * as echarts from 'echarts';
import 'leaflet'
window.echarts = echarts;

View File

@ -0,0 +1,4 @@
import axios from 'axios';
window.axios = axios;
window.axios.defaults.headers.common['X-Requested-With'] = 'XMLHttpRequest';

View File

@ -0,0 +1,17 @@
<!DOCTYPE html>
<html lang="de">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Dashboard</title>
@vite(['resources/css/app.css', 'resources/js/app.js', 'node_modules/leaflet/dist/leaflet.css'])
</head>
<body class="@yield('body-class')">
<header>
@yield('header')
</header>
<main>
@yield('main')
</main>
</body>
</html>

View File

@ -0,0 +1,365 @@
@extends('base')
@section('body-class', 'overview')
@section('header')
<nav>
<strong>Start</strong>
<ul>
@foreach($regions as $r)
<li><a href="/region/{{ $r['id'] }}">{{ $r['name'] }}</a></li>
@endforeach
</ul>
</nav>
@endsection
@section('main')
<article class="header" style="grid-area: chart1;">
<header>
<h2>Verfügbarkeit aller Mietobjekte über gesamten beobachteten Zeitraum</h2>
<button popovertarget="pop1">
<span>Erklärungen zum Diagramm</span>
</button>
<div popover id="pop1">
<h2>Verfügbarkeit aller Mietobjekte über gesamten beobachteten Zeitraum</h2>
<p>
Das Diagramm zeigt die Verfügbarkeit aller Mietobjekte zu allen beobachteten Zeitpunkten.
</p>
<ul>
<li>X-Achse: Zeitpunkt Beobachtung.</li>
<li>Y-Achse: Mietobjekte.</li>
<li>Kategorien: 0% = Das Mietobjekt ist komplett ausgebucht; 100% = Das Mietobjekt kann zu allen verfügbaren Daten gebucht werden.</li>
</ul>
<h3>Berechnung der Verfügbarkeit</h3>
<p>Die Verfügbarkeit eines Mietobjekts errechnet sich folgendermassen:</p>
<p class="formula">
Verfügbarkeit = (100 / (Anzahl Buchungsdaten * 2)) * Summe Status
</p>
<ul>
<li>Status: Jeder verfügbare Kalendertag kann den Status «Nicht Verfügbar» (0), «Verfügbar (kein Anreisetag)» (1) oder «Verfügbar» (2) aufweisen.</li>
<li>Anzahl Buchungsdaten: Die Anzahl aller angebotenen Buchungsdaten; multipliziert mit zwei ergibt sie den Maximalwert (alle Buchungsdaten haben den Status «Verfügbar»).</li>
</ul>
</div>
</header>
<div id="chart-heatmap"></div>
</article>
<article class="header" style="grid-area: chart2;">
<header>
<h2>
Anzahl jemals gefundener Kurzzeitmietobjekte pro Region
</h2>
<button popovertarget="pop2">
<span>Erklärungen zum Diagramm</span>
</button>
<div popover id="pop2">
<h2>Anzahl jemals gefundener Mietobjekte pro Region</h2>
<p>
Das Balkendiagramm zeigt die Anzahl jemals gefundener Mietobjekte pro Region.
</p>
<ul>
<li>X-Achse: Region</li>
<li>Y-Achse: Anzahl Mietobjekte</li>
</ul>
</div>
</header>
<div id="chart-props-per-region"></div>
</article>
<article class="header" style="grid-area: chart3;">
<header>
<h2>
Entwicklung der Anzahl jemals gefundener Kurzzeitmietobjekte
</h2>
<button popovertarget="pop3">
<span>Erklärungen zum Diagramm</span>
</button>
<div popover id="pop3">
<h2>Entwicklung Anzahl jemals gefundener Mietobjekte pro Region</h2>
<p>
Das Liniendiagramm zeigt die Entwicklung aller jemals gefundenen Mietobjekte pro Region.
</p>
<ul>
<li>X-Achse: Zeitpunkt Beobachtung</li>
<li>Y-Achse: Anzahl Mietobjekte</li>
</ul>
</div>
</header>
<div id="extractions"></div>
</article>
<article style="grid-area: chart4;">
<div id="leaflet"></div>
</article>
<script type="module">
const sharedOptions = {
basic: {
color: {!! $diagramsOptions['shared']['colors'] !!},
grid: {
top: 30,
left: 70,
right: 0,
bottom: 45
},
name: (opt) => {
return {
name: opt.name,
nameLocation: opt.location,
nameGap: 50,
nameTextStyle: {
fontWeight: 'bold',
},
}
}
}
}
const extractionDates = {!! $diagramsOptions['shared']['extractionDates'] !!};
const chartHeatmap = document.getElementById('chart-heatmap');
const cHeatmap = echarts.init(chartHeatmap);
const cHeatmapOptions = {
animation: false,
tooltip: {
position: 'top'
},
grid: {
show: true,
borderWidth: 1,
borderColor: '#aaa',
top: 30,
right: 45,
bottom: 70,
left: 30
},
dataZoom: [{
type: 'slider'
},
{
type: 'slider',
show: true,
yAxisIndex: 0,
}],
xAxis: {
show: true,
name: 'Zeitpunkt Beobachtung',
type: 'category',
data: extractionDates,
splitArea: {
show: false
},
axisLabel: {
show: false,
},
axisTick: {
show: false,
},
axisLine: {
show: false,
},
nameLocation: 'center',
nameGap: 10,
nameTextStyle: {
fontWeight: 'bold',
}
},
yAxis: {
show: true,
type: 'category',
data: {!! $diagramsOptions['heatmap']['yAxis']['data'] !!},
splitArea: {
show: false
},
axisTick: {
show: false,
},
axisLine: {
show: false,
},
axisLabel: {
show: false,
},
name: 'Mietobjekte',
nameLocation: 'center',
nameGap: 10,
nameTextStyle: {
fontWeight: 'bold',
}
},
visualMap: {
type: 'piecewise',
min: 0,
max: 100,
calculable: true,
orient: 'horizontal',
left: 'center',
top: 0,
formatter: (v1, v2) => {
return `${v1}–${v2}%`;
},
inRange: {
color: sharedOptions.basic.color,
},
},
series: [
{
name: 'Verfügbarkeit',
type: 'heatmap',
blurSize: 0,
data: {!! $diagramsOptions['heatmap']['series']['data'] !!},
label: {
show: false
},
tooltip: {
formatter: (data) => {
return `Kurzzeitmietobjekte-ID: ${data.data[1]}<br />Beobachtungszeitpunkt: ${data.data[0]}<br/>Verfügbarkeit: ${data.data[2].toFixed(2)}%`
},
},
emphasis: {
itemStyle: {
borderColor: '#000',
borderWidth: 2
}
}
}
]
}
cHeatmap.setOption(cHeatmapOptions);
const chartPropsPerRegion = document.getElementById('chart-props-per-region');
const cPropsPerRegion = echarts.init(chartPropsPerRegion);
const cPropsPerRegionOptions = {
grid: sharedOptions.basic.grid,
color: sharedOptions.basic.color,
xAxis: {
name: 'Region',
nameLocation: 'center',
nameGap: 30,
nameTextStyle: {
fontWeight: 'bold',
},
type: 'category',
data: {!! $diagramsOptions['propertiesPerRegion']['xAxis']['data'] !!}
},
yAxis: {
type: 'value',
name: 'Anzahl Mietobjekte',
nameLocation: 'middle',
nameGap: 50,
nameTextStyle: {
fontWeight: 'bold',
},
},
series: [
{
data: {!! $diagramsOptions['propertiesPerRegion']['yAxis']['data'] !!},
type: 'bar',
itemStyle: {
color: (e) => {
return sharedOptions.basic.color[e.dataIndex];
}
}
},
]
};
cPropsPerRegion.setOption(cPropsPerRegionOptions);
const chartExtractions = document.getElementById('extractions');
const cExtractions = echarts.init(chartExtractions);
const cExtractionsOptions = {
color: sharedOptions.basic.color,
tooltip: {
trigger: 'axis'
},
legend: {
show: true
},
grid: sharedOptions.basic.grid,
xAxis: {
name: 'Zeitpunkt Beobachtung',
nameLocation: 'center',
nameGap: 24,
nameTextStyle: {
fontWeight: 'bold',
},
type: 'category',
boundaryGap: false,
data: extractionDates
},
yAxis: {
name: 'Anzahl Mietobjekte',
nameLocation: 'center',
nameGap: 50,
nameTextStyle: {
fontWeight: 'bold',
},
type: 'value'
},
series: [
{
name: 'Alle',
type: 'line',
stack: 'Total',
data: {!! json_encode($diagramsOptions['extractions']['series']['total_all']) !!},
},
{
connectNulls: true,
name: 'Davos',
type: 'line',
data: {!! json_encode($diagramsOptions['extractions']['series']['total_davos']) !!}
},
{
connectNulls: true,
name: 'Engadin',
type: 'line',
data: {!! json_encode($diagramsOptions['extractions']['series']['total_engadin']) !!}
},
{
connectNulls: true,
name: 'Heidiland',
type: 'line',
data: {!! json_encode($diagramsOptions['extractions']['series']['total_heidiland']) !!}
},
{
connectNulls: true,
name: 'St. Moritz',
type: 'line',
data: {!! json_encode($diagramsOptions['extractions']['series']['total_stmoritz']) !!}
},
]
};
cExtractions.setOption(cExtractionsOptions);
const map = L.map('leaflet');
L.tileLayer('https://tile.openstreetmap.org/{z}/{x}/{y}.png', {
maxZoom: 19,
attribution: '&copy; <a href="http://www.openstreetmap.org/copyright">OpenStreetMap</a>'
}).addTo(map);
function icon(id){
return L.divIcon({
className: "region"+id,
html: '<span></span>'
})
}
const markers = L.featureGroup([
@foreach($geo as $g)
L.marker([{{ $g['latlng'] }}], {icon: icon({{ $g['region_id'] }})}).bindPopup('<a href="/property/{{ $g['property_id'] }}">{{ $g['latlng'] }}</a>'),
@endforeach
]).addTo(map);
map.fitBounds(markers.getBounds(), {padding: [20,20]})
cHeatmap.on('click', 'series', (e) => {
window.open(`/property/${e.value[1]}?date=${e.value[0]}`, '_self');
})
</script>
@endsection
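The availability formula described in the popovers above (Verfügbarkeit = (100 / (Anzahl Buchungsdaten * 2)) * Summe Status) can be sketched as a standalone helper. This is an illustrative sketch, not code from the repository; the function name and the null return for empty input are assumptions.

```javascript
// Illustrative sketch of the availability formula from the popover text
// (assumption: not part of the repository). Each bookable calendar day
// carries a status: 0 = «Nicht Verfügbar», 1 = «Verfügbar (kein Anreisetag)»,
// 2 = «Verfügbar».
function availability(statuses) {
  if (statuses.length === 0) return null; // no booking data available
  const statusSum = statuses.reduce((acc, s) => acc + s, 0);
  // 100% when every offered day has status 2 (fully available)
  return (100 / (statuses.length * 2)) * statusSum;
}
```

For example, `availability([2, 2, 1, 0])` returns 62.5, matching the formula: 100 / (4 * 2) * 5.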

View File

@ -0,0 +1,27 @@
@extends('base')
@section('body-class', 'property')
@section('header')
<nav>
<strong>Property: {{ $base['check_data'] }} ({{ $base['region_name'] }})</strong>
<ul>
<li><a href="/">Start</a></li>
@foreach($regions as $r)
<li><a href="/region/{{ $r['id'] }}">{{ $r['name'] }}</a></li>
@endforeach
</ul>
</nav>
<button popovertarget="prop-details">
<span>Details zum Mietobjekt</span>
</button>
<div popover id="prop-details">
<dl>
<dt>Region</dt>
<dd>{{ $base['region_name'] }}</dd>
<dt>Zum ersten Mal gefunden</dt>
<dd>{{ $base['first_found'] }}</dd>
<dt>Zum letzten Mal gefunden</dt>
<dd>{{ $base['last_found'] }}</dd>
</dl>
</div>
@endsection
@section('main')
<p>Für dieses Mietobjekt sind keine Daten vorhanden.</p>
@endsection

View File

@ -0,0 +1,463 @@
@extends('base')
@section('body-class', 'property')
@section('header')
<nav>
<strong>Mietobjekt: {{ $base['latlng'] }} ({{ $base['region_name'] }})</strong>
<ul>
<li><a href="/">Start</a></li>
@foreach($regions as $r)
<li><a href="/region/{{ $r['id'] }}">{{ $r['name'] }}</a></li>
@endforeach
</ul>
</nav>
<button popovertarget="prop-details">
<span>Details zum Mietobjekt</span>
</button>
<div popover id="prop-details">
<dl>
<dt>Region</dt>
<dd>{{ $base['region_name'] }}</dd>
<dt>Zum ersten Mal gefunden</dt>
<dd>{{ $base['first_found'] }}</dd>
<dt>Zum letzten Mal gefunden</dt>
<dd>{{ $base['last_found'] }}</dd>
</dl>
</div>
@endsection
@section('main')
<article style="grid-area: timeline;">
<div id="timeline"></div>
</article>
<article class="header" style="grid-area: chart2;">
<header>
<h2 id="belegung-title">
Kalenderansicht der Verfügbarkeit am <span class="date">{{ $startDate }}</span>
</h2>
<button popovertarget="popup-cal">
<span>Erklärungen zum Diagramm</span>
</button>
<div popover id="popup-cal">
<p>
Das Kalenderdiagramm zeigt die drei Verfügbarkeitskategorien des Mietobjekts.
</p>
</div>
</header>
<div id="chart-calendar"></div>
</article>
<article class="header map" style="grid-area: chart5;">
<header>
<h2>
Kurzzeitmietobjekte in der Nähe
</h2>
</header>
<div id="chart-map"></div>
</article>
<article class="header" style="grid-area: chart3;">
<header>
<h2>
Verfügbarkeit des Mietobjekts nach Monat am <span class="date">{{ $startDate }}</span>
</h2>
</header>
<div id="chart-capacity-monthly">
</div>
</article>
<article class="header" style="grid-area: chart1;">
<header>
<h2>
Entwicklung der Verfügbarkeit
</h2>
<button popovertarget="chart-capacity-popover">
<span>Erklärungen zum Diagramm</span>
</button>
<div id="chart-capacity-popover" popover>
<h2>Erklärung zum Diagramm</h2>
<p>Das Liniendiagramm zeigt, wie sich die Verfügbarkeit des Kurzzeitmietobjekts insgesamt entwickelt hat.</p>
</div>
</header>
<div id="chart-capacity"></div>
</article>
<article class="header" style="grid-area: chart4;">
<header>
<h2>
Verfügbarkeit des Mietobjekts nach Wochentag am <span class="date">{{ $startDate }}</span>
</h2>
</header>
<div id="chart-capacity-daily">
</div>
</article>
<script type="module">
const sharedOptions = {
extractiondates: {!! $diagramsOptions['shared']['extractiondates']!!},
basic: {
color: {!!$diagramsOptions['shared']['colors']!!},
grid: {
top: 20,
left: 60,
right: 0,
bottom: 50
},
tooltip: {
show: true,
trigger: 'axis',
valueFormatter: (value) => value.toFixed(2) + '%'
},
name: (opt) => {
return {
name: opt.name,
nameLocation: opt.location,
nameGap: 24,
nameTextStyle: {
fontWeight: 'bold',
}
}
}
}
}
const chartTimeline = document.getElementById('timeline');
const cTimeline = echarts.init(chartTimeline);
const cTimelineOptions = {
grid: {
show: false,
},
timeline: {
data: sharedOptions.extractiondates,
playInterval: 1000,
axisType: 'time',
left: 8,
right: 8,
bottom: 0,
label: {
show: false
}
},
};
cTimeline.setOption(cTimelineOptions);
const chartCapacityMonthly = document.getElementById('chart-capacity-monthly');
const cCapacityMonthly = echarts.init(chartCapacityMonthly);
const cCapacityMonthlyOptions = {
tooltip: sharedOptions.basic.tooltip,
timeline: {
show: false,
data: sharedOptions.extractiondates,
axisType: 'time',
},
grid: {
top: 5,
bottom: 40,
left: 70,
right: 10
},
xAxis: {
type: 'value',
max: 100,
name: 'Verfügbarkeit in %',
nameLocation: 'center',
nameGap: 25,
nameTextStyle: {
fontWeight: 'bold',
}
},
yAxis: {
type: 'category',
},
options: [
@foreach ($diagramsOptions['capacityMonthly']['options'] as $cM)
{
yAxis: {
data: {!! json_encode($cM['months']) !!}
},
series: [{
type: 'bar',
itemStyle: {
color: sharedOptions.basic.color[3]
},
data: {!! json_encode($cM['capacities']) !!}
}]
},
@endforeach
]
};
cCapacityMonthly.setOption(cCapacityMonthlyOptions);
const chartCapacityDaily = document.getElementById('chart-capacity-daily');
const cCapacityDaily = echarts.init(chartCapacityDaily);
const cCapacityDailyOptions = {
tooltip: sharedOptions.basic.tooltip,
timeline: {
show: false,
data: sharedOptions.extractiondates,
axisType: 'time',
},
grid: {
top: 5,
bottom: 40,
left: 70,
right: 10
},
xAxis: {
type: 'value',
max: 100,
name: 'Verfügbarkeit in %',
nameLocation: 'center',
nameGap: 25,
nameTextStyle: {
fontWeight: 'bold',
}
},
yAxis: {
type: 'category',
},
options: [
@foreach ($diagramsOptions['capacityDaily']['options'] as $cD)
{
yAxis: {
data: {!! json_encode($cD['weekdays']) !!}
},
series: [{
type: 'bar',
itemStyle: {
color: sharedOptions.basic.color[3]
},
data: {!! json_encode($cD['capacities']) !!}
}]
},
@endforeach
]
};
cCapacityDaily.setOption(cCapacityDailyOptions);
const chartCapacity = document.getElementById('chart-capacity');
const cCapacity = echarts.init(chartCapacity);
const cCapacityOptions = {
color: sharedOptions.basic.color,
legend: {
show: true
},
tooltip: {
trigger: 'axis',
valueFormatter: (value) => value.toFixed(2)+'%'
},
grid: {
top: 40,
left: 25,
right: 10,
bottom: 20,
containLabel: true
},
xAxis: {
type: 'category',
boundaryGap: false,
data: {!! $diagramsOptions['capacities']['xAxis']['data'] !!},
name: 'Zeitpunkt Beobachtung',
nameLocation: 'center',
nameGap: 24,
nameTextStyle: {
fontWeight: 'bold',
}
},
yAxis: {
type: 'value',
min: 0,
max: 100,
name: 'Verfügbarkeit in %',
nameLocation: 'center',
nameGap: 38,
nameTextStyle: {
fontWeight: 'bold',
}
},
series: [
{
name: 'Verfügbarkeit Mietobjekt',
type: 'line',
symbolSize: 7,
data: {!! $diagramsOptions['capacities']["series"][0]["data"] !!}
},
{
name: 'Verfügbarkeit {{ $base['region_name'] }}',
type: 'line',
symbolSize: 7,
data: {!! $diagramsOptions['capacities']["series"][1]["data"] !!}
},
{
name: 'Verfügbarkeit alle Regionen',
type: 'line',
symbolSize: 7,
data: {!! $diagramsOptions['capacities']["series"][2]["data"] !!}
}
]
};
cCapacity.setOption(cCapacityOptions);
const chartCalendar = document.getElementById('chart-calendar');
const cCalendar = echarts.init(chartCalendar);
const cCalendarOptions = {
timeline: {
show: false,
data: sharedOptions.extractiondates,
axisType: 'time',
},
visualMap: {
categories: [0,1,2],
inRange: {
color: ['#ca0020', '#92c5de', '#0571b0']
},
formatter: (cat) => {
switch (cat) {
case 0:
return 'Ausgebucht';
case 1:
return 'Verfügbar (kein Anreisetag)';
case 2:
return 'Verfügbar';
}
},
type: 'piecewise',
orient: 'horizontal',
left: 'center',
top: 0
},
calendar:[
{
orient: 'horizontal',
range: '2024',
top: '15%',
right: 10,
bottom: '65%',
left: 50,
dayLabel: {
fontSize: 10
}
},
{
orient: 'horizontal',
range: '2025',
top: '47%',
right: 10,
bottom: '33%',
left: 50,
dayLabel: {
fontSize: 10
}
},
{
orient: 'horizontal',
range: '2026',
top: '79%',
right: 10,
bottom: '1%',
left: 50,
dayLabel: {
fontSize: 10
}
}
],
options: [
@foreach ($diagramsOptions['calendar']['series'] as $c)
{
series: [{
type: 'heatmap',
coordinateSystem: 'calendar',
calendarIndex: 0,
data: {!! json_encode($c) !!}
},
{
type: 'heatmap',
coordinateSystem: 'calendar',
calendarIndex: 1,
data: {!! json_encode($c) !!}
},
{
type: 'heatmap',
coordinateSystem: 'calendar',
calendarIndex: 2,
data: {!! json_encode($c) !!}
}]
},
@endforeach
]
};
cCalendar.setOption(cCalendarOptions);
cTimeline.on('timelinechanged', (e) => {
let dateTitles = document.querySelectorAll('span.date');
dateTitles.forEach(el => {
el.innerText = cTimelineOptions.timeline.data[e.currentIndex];
});
// Set markpoint on linechart
let x = cCapacityOptions.xAxis.data[e.currentIndex];
let y = cCapacityOptions.series[0].data[e.currentIndex];
cCapacityMonthly.dispatchAction({
type: 'timelineChange',
currentIndex: e.currentIndex
});
cCapacityDaily.dispatchAction({
type: 'timelineChange',
currentIndex: e.currentIndex
});
cCalendar.dispatchAction({
type: 'timelineChange',
currentIndex: e.currentIndex
});
cCapacity.setOption({
series: {
markPoint: {
data: [{
coord: [x, y]
}]
}
}
});
})
/* Map w/ neighbours*/
const map = L.map('chart-map');
L.tileLayer('https://tile.openstreetmap.org/{z}/{x}/{y}.png', {
maxZoom: 19,
attribution: '&copy; <a href="http://www.openstreetmap.org/copyright">OpenStreetMap</a>'
}).addTo(map);
function icon(id = 0){
return L.divIcon({
className: "region"+id,
html: '<span></span>'
})
}
const markers = L.featureGroup([
L.marker([{{ $base['latlng'] }}], {icon: icon(1)}),
@foreach($neighbours as $n)
L.marker([{{ $n['lat'] }}, {{ $n['lon'] }}], {icon: icon()}).bindPopup('<a href="/property/{{ $n['id'] }}">{{ $n['lat'] }}, {{ $n['lon'] }}</a>'),
@endforeach
]).addTo(map);
map.fitBounds(markers.getBounds(), {padding: [20,20]})
cCapacity.on('click', 'series', (e) => {
// Switch to correct calendar in the timeline
cTimeline.dispatchAction({
type: 'timelineChange',
currentIndex: e.dataIndex
});
});
</script>
@endsection
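The «Kurzzeitmietobjekte in der Nähe» map above is fed by an ETL step that, per the commit history, uses the haversine formula to find neighbouring properties. A minimal sketch of that distance calculation follows; the function name and radius constant are assumptions, not the repository's actual implementation.

```javascript
// Hedged sketch of a haversine great-circle distance (assumed helper,
// not the repository's ETL code).
function haversineKm(lat1, lon1, lat2, lon2) {
  const R = 6371; // mean earth radius in km (assumption)
  const toRad = (deg) => (deg * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}
```

One degree of latitude corresponds to roughly 111 km, which gives a quick sanity check for the implementation.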

View File

@ -0,0 +1,583 @@
@extends('base')
@section('body-class', 'region')
@section('header')
<nav>
<strong>{{ $region['name'] }}</strong>
<ul>
<li><a href="/">Start</a></li>
@foreach($regions as $r)
@if($r['id'] != $regionId)
<li><a href="/region/{{ $r['id'] }}">{{ $r['name'] }}</a></li>
@endif
@endforeach
</ul>
</nav>
@endsection
@section('main')
<article style="grid-area: timeline;">
<div id="timeline"></div>
</article>
<article class="header" style="grid-area: chart6;">
<header>
<h2 id="prediction-title">Gleitender Mittelwert für die Verfügbarkeit der Region</h2>
<button popovertarget="chart-prediction-popover">
<span>Erklärungen zum Diagramm</span>
</button>
<div id="chart-prediction-popover" popover>
<h2>Gleitender Mittelwert für die Verfügbarkeit der Region</h2>
<p>Das Diagramm zeigt den gleitenden Mittelwert der Verfügbarkeit der Region zusammen mit den Ausgangs- und Vergleichsdaten.</p>
<ul>
<li>X-Achse: Zeitpunkt der Beobachtung</li>
<li>Y-Achse: Verfügbarkeit einer Region. 0% = Alle Mietobjekte der Region sind komplett ausgebucht; 100% = Alle Mietobjekte der Region können zu allen verfügbaren Daten gebucht werden. </li>
</ul>
</div>
</header>
<div id="chart-prediction"></div>
</article>
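The prediction chart above plots a series named «Gleitender Mittelwert» over the regional availability. A minimal sketch of a trailing moving average that also yields a value for the first data point (window size and null handling are assumptions, not the repository's implementation):

```javascript
// Hedged sketch of a trailing moving average (assumed window and null
// handling; not the repository's implementation).
function movingAverage(values, windowSize = 3) {
  return values.map((_, i) => {
    // trailing window: up to `windowSize` values ending at index i
    const slice = values
      .slice(Math.max(0, i - windowSize + 1), i + 1)
      .filter((v) => v != null);
    if (slice.length === 0) return null; // no usable data in window
    return slice.reduce((acc, v) => acc + v, 0) / slice.length;
  });
}
```

For example, `movingAverage([2, 4, 6], 2)` returns `[2, 3, 5]`: the first point averages only itself, so the series starts at the first observation instead of with a gap.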
<article class="header" style="grid-area: chart1;">
<header>
<h2 id="belegung-title">Verfügbarkeit aller Mietobjekte der Region über gesamten beobachteten Zeitraum</h2>
<button popovertarget="popup-heat">
<span>Erklärungen zum Diagramm</span>
</button>
<div popover id="popup-heat">
<h2>Verfügbarkeit aller Mietobjekte der Region über gesamten beobachteten Zeitraum</h2>
<p>
Das Diagramm zeigt die Verfügbarkeit aller Mietobjekte der Region zu allen beobachteten Zeitpunkten.
</p>
<ul>
<li>X-Achse: Zeitpunkt Beobachtung.</li>
<li>Y-Achse: Mietobjekte.</li>
<li>Kategorien: 0% = Das Mietobjekt ist komplett ausgebucht; 100% = Das Mietobjekt kann zu allen verfügbaren Daten gebucht werden.</li>
</ul>
<h3>Berechnung der Verfügbarkeit</h3>
<p>Die Verfügbarkeit eines Mietobjekts errechnet sich folgendermassen:</p>
<p class="formula">
Verfügbarkeit = (100 / (Anzahl Buchungsdaten * 2)) * Summe Status
</p>
<ul>
<li>Status: Jeder verfügbare Kalendertag kann den Status «Nicht Verfügbar» (0), «Verfügbar (kein Anreisetag)» (1) oder «Verfügbar» (2) aufweisen.</li>
<li>Anzahl Buchungsdaten: Die Anzahl aller angebotenen Buchungsdaten; multipliziert mit zwei ergibt sie den Maximalwert (alle Buchungsdaten haben den Status «Verfügbar»).</li>
</ul>
</div>
</header>
<div id="chart-heatmap"></div>
</article>
<article class="header" style="grid-area: chart3;">
<header>
<h2>
Verfügbarkeit nach Monat am <span class="date">{{ $startDate }}</span>
</h2>
</header>
<div id="chart-capacity-monthly">
</div>
</article>
<article class="header" style="grid-area: chart2;">
<header>
<h2>
Entwicklung der Verfügbarkeit
</h2>
<button popovertarget="chart-capacity-popover">
<span>Erklärungen zum Diagramm</span>
</button>
<div id="chart-capacity-popover" popover>
<h2>Entwicklung der Verfügbarkeit</h2>
<p>Das Liniendiagramm zeigt die Entwicklung der Verfügbarkeit der Region im Vergleich zu allen Regionen an.</p>
<ul>
<li>X-Achse: Zeitpunkt der Beobachtung</li>
<li>Y-Achse: Verfügbarkeit einer Region. 0% = Alle Mietobjekte der Region sind komplett ausgebucht; 100% = Alle Mietobjekte der Region können zu allen verfügbaren Daten gebucht werden. </li>
</ul>
</div>
</header>
<div id="chart-capacity"></div>
</article>
<article class="header" style="grid-area: chart4;">
<header>
<h2>
Verfügbarkeit nach Wochentagen am <span class="date">{{ $startDate }}</span>
</h2>
</header>
<div id="chart-capacity-daily">
</div>
</article>
<script type="module">
const sharedOptions = {
basic: {
color: {!! $diagramsOptions['shared']['colors'] !!},
grid: {
top: 20,
left: 60,
right: 0,
bottom: 50
},
tooltip: {
show: true,
trigger: 'axis',
valueFormatter: (value) => value == null ? 'N/A' : value.toFixed(2)+'%'
},
name: (opt) => {
return {
name: opt.name,
nameLocation: opt.location,
nameGap: 24,
nameTextStyle: {
fontWeight: 'bold',
},
}
}
}
}
const chartCapacity = document.getElementById('chart-capacity');
const cCapacity = echarts.init(chartCapacity);
const cCapacityOptions = {
legend: {
show: true
},
tooltip: sharedOptions.basic.tooltip,
color: sharedOptions.basic.color,
grid: {
top: 20,
left: 25,
right: 10,
bottom: 20,
containLabel: true
},
xAxis: {
type: 'category',
boundaryGap: false,
data: {!! $diagramsOptions['capacity']['xAxis']['data'] !!},
name: 'Zeitpunkt Beobachtung',
nameLocation: 'center',
nameGap: 24,
nameTextStyle: {
fontWeight: 'bold',
}
},
yAxis: {
type: 'value',
min: 0,
max: 100,
name: 'Verfügbarkeit in %',
nameLocation: 'center',
nameGap: 38,
nameTextStyle: {
fontWeight: 'bold',
}
},
series: [{
name: 'Verfügbarkeit alle Regionen',
type: 'line',
symbolSize: 7,
data: {!! $diagramsOptions['capacity']['series']['all']['data'] !!}
},
{
name: 'Verfügbarkeit Region',
type: 'line',
symbolSize: 7,
data: {!! $diagramsOptions['capacity']['series']['region']['data'] !!}
}]
};
cCapacity.setOption(cCapacityOptions);
const chartCapacityMonthly = document.getElementById('chart-capacity-monthly');
const cCapacityMonthly = echarts.init(chartCapacityMonthly);
const cCapacityMonthlyOptions = {
timeline: {
show: false,
data: {!! $diagramsOptions['capacity']['xAxis']['data'] !!},
axisType: 'time',
},
grid: {
top: 5,
bottom: 40,
left: 70,
right: 10
},
xAxis: {
type: 'value',
max: 100,
name: 'Verfügbarkeit in %',
nameLocation: 'center',
nameGap: 25,
nameTextStyle: {
fontWeight: 'bold',
}
},
yAxis: {
type: 'category',
},
tooltip: sharedOptions.basic.tooltip,
options: [
@foreach ($diagramsOptions['capacityMonthly']['options'] as $m)
{
yAxis: {
data: {!! json_encode($m['months']) !!}
},
series: [{
type: 'bar',
itemStyle: {
color: sharedOptions.basic.color[3]
},
data: {!! json_encode($m['capacities']) !!}
}]
},
@endforeach
]
};
cCapacityMonthly.setOption(cCapacityMonthlyOptions);
const chartCapacityDaily = document.getElementById('chart-capacity-daily');
const cCapacityDaily = echarts.init(chartCapacityDaily);
const cCapacityDailyOptions = {
timeline: {
show: false,
data: {!! $diagramsOptions['capacity']['xAxis']['data'] !!},
axisType: 'time',
},
tooltip: sharedOptions.basic.tooltip,
grid: {
top: 5,
bottom: 40,
left: 70,
right: 10
},
xAxis: {
type: 'value',
max: 100,
name: 'Verfügbarkeit in %',
nameLocation: 'center',
nameGap: 25,
nameTextStyle: {
fontWeight: 'bold',
}
},
yAxis: {
type: 'category',
},
options: [
@foreach ($diagramsOptions['capacityDaily']['options'] as $d)
{
yAxis: {
data: {!! json_encode($d['weekdays']) !!}
},
series: [{
type: 'bar',
itemStyle: {
color: sharedOptions.basic.color[3]
},
data: {!! json_encode($d['capacities']) !!}
}]
},
@endforeach
]
};
cCapacityDaily.setOption(cCapacityDailyOptions);
const chartPrediction = document.getElementById('chart-prediction');
const cPrediction = echarts.init(chartPrediction);
const cPredictionOptions = {
color: [sharedOptions.basic.color[0], sharedOptions.basic.color[4], sharedOptions.basic.color[5]],
timeline: {
show: false,
data: {!! $diagramsOptions['capacity']['xAxis']['data'] !!},
axisType: 'time',
replaceMerge: ['graphic', 'series']
},
legend: {
show: true
},
tooltip: sharedOptions.basic.tooltip,
grid: {
top: 20,
left: 25,
right: 10,
bottom: 20,
containLabel: true
},
xAxis: {
type: 'category',
boundaryGap: false,
name: 'Zeitpunkt Beobachtung',
nameLocation: 'center',
nameGap: 24,
nameTextStyle: {
fontWeight: 'bold',
},
},
yAxis: {
type: 'value',
min: 0,
max: 100,
name: 'Verfügbarkeit in %',
nameLocation: 'center',
nameGap: 38,
nameTextStyle: {
fontWeight: 'bold',
}
},
options: [
@foreach ($diagramsOptions['predictions']['options'] as $p)
@if($p === null)
{
graphic: {
elements: [
{
type: 'text',
left: 'center',
top: 'center',
style: {
text: 'Keine Daten für Zeitspanne',
fontSize: 44,
fontWeight: 'bold',
}
}
]
}
},
@else
{
color: sharedOptions.basic.color,
graphic: {
elements: []
},
xAxis: {
data: {!! json_encode($p['dates']) !!}
},
series: [
{
name: 'Gleitender Mittelwert',
showSymbol: false,
connectNulls: true,
type: 'line',
symbolSize: 7,
data: {!! json_encode($p['capacities_moving_average']) !!}
},
{
name: 'Ausgangsdaten',
showSymbol: false,
connectNulls: true,
type: 'line',
symbolSize: 7,
data: {!! json_encode($p['capacities_timeframe_before']) !!}
},
{
name: 'Vergleichsdaten',
showSymbol: false,
connectNulls: true,
type: 'line',
symbolSize: 7,
data: {!! json_encode($p['capacities_timeframe_after']) !!}
}
]
},
@endif
@endforeach
]
};
cPrediction.setOption(cPredictionOptions);
const chartHeatmap = document.getElementById('chart-heatmap');
const cHeatmap = echarts.init(chartHeatmap);
const cHeatmapOptions = {
animation: false,
tooltip: {
position: 'top'
},
grid: {
show: true,
borderWidth: 1,
borderColor: '#aaa',
top: 30,
right: 45,
bottom: 70,
left: 30
},
dataZoom: [{
type: 'slider'
},
{
type: 'slider',
show: true,
yAxisIndex: 0,
}],
xAxis: {
show: true,
name: 'Zeitpunkt Beobachtung',
type: 'category',
data: {!! $diagramsOptions['heatmap']['xAxis']['data'] !!},
splitArea: {
show: false
},
axisLabel: {
show: false,
},
axisTick: {
show: false,
},
axisLine: {
show: false,
},
nameLocation: 'center',
nameGap: 10,
nameTextStyle: {
fontWeight: 'bold',
}
},
yAxis: {
show: true,
type: 'category',
data: {!! $diagramsOptions['heatmap']['yAxis']['data'] !!},
splitArea: {
show: false
},
axisTick: {
show: false,
},
axisLine: {
show: false,
},
axisLabel: {
show: false,
},
name: 'Mietobjekte',
nameLocation: 'center',
nameGap: 10,
nameTextStyle: {
fontWeight: 'bold',
}
},
visualMap: {
type: 'piecewise',
min: 0,
max: 100,
calculable: true,
orient: 'horizontal',
left: 'center',
top: 0,
formatter: (v1, v2) => {
return `${v1}-${v2}%`;
},
inRange: {
color: sharedOptions.basic.color,
},
},
series: [
{
name: 'Verfügbarkeit',
type: 'heatmap',
blurSize: 0,
data: {!! $diagramsOptions['heatmap']['series']['data'] !!},
label: {
show: false
},
tooltip: {
formatter: (data) => {
return `Kurzzeitmietobjekte-ID: ${data.data[1]}<br />Beobachtungszeitpunkt: ${data.data[0]}<br/>Verfügbarkeit: ${data.data[2].toFixed(2)}%`
},
},
emphasis: {
itemStyle: {
borderColor: '#000',
borderWidth: 2
}
}
}
]
}
cHeatmap.setOption(cHeatmapOptions);
const chartTimeline = document.getElementById('timeline');
const cTimeline = echarts.init(chartTimeline);
const cTimelineOptions = {
grid: {
show: false,
},
timeline: {
data: {!! $diagramsOptions['capacity']['xAxis']['data'] !!},
playInterval: 2000,
axisType: 'time',
left: 8,
right: 8,
bottom: 0,
label: {
show: false
}
},
};
cTimeline.setOption(cTimelineOptions);
cTimeline.on('timelinechanged', (e) => {
let dateTitles = document.querySelectorAll('span.date');
dateTitles.forEach(el => {
el.innerText = cTimelineOptions.timeline.data[e.currentIndex];
});
// Set markpoint on linechart
let x = cCapacityOptions.xAxis.data[e.currentIndex];
let y = cCapacityOptions.series[0].data[e.currentIndex];
cCapacityMonthly.dispatchAction({
type: 'timelineChange',
currentIndex: e.currentIndex
});
cCapacityDaily.dispatchAction({
type: 'timelineChange',
currentIndex: e.currentIndex
});
cPrediction.dispatchAction({
type: 'timelineChange',
currentIndex: e.currentIndex
});
cCapacity.setOption({
series: {
markPoint: {
data: [{
coord: [x, y]
}]
}
}
});
})
document.querySelector('header').addEventListener('click', () => {
cCapacityMonthly.dispatchAction({
type: 'timelineChange',
currentIndex: 10
});
})
cCapacity.on('click', 'series', (e) => {
// Switch to correct calendar in the timeline
cTimeline.dispatchAction({
type: 'timelineChange',
currentIndex: e.dataIndex
});
});
cHeatmap.on('click', 'series', (e) => {
window.open(`/property/${e.value[1]}?date=${e.value[0]}`, '_self');
})
</script>
@endsection
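The `timelinechanged` handler above mirrors the current timeline index into the other chart instances and places a markPoint on the capacity line chart. As a minimal, dependency-free sketch (hypothetical option shape, no ECharts required), the markPoint lookup it performs boils down to:

```javascript
// Hypothetical stand-in for cCapacityOptions: for the current timeline index,
// read the matching x-axis label and series value, as the handler does before
// passing the pair to series.markPoint.data.
function markPointCoord(option, currentIndex) {
  const x = option.xAxis.data[currentIndex];
  const y = option.series[0].data[currentIndex];
  return [x, y];
}

const option = {
  xAxis: { data: ['2024-01-01', '2024-02-01', '2024-03-01'] },
  series: [{ data: [42.5, 55.1, 61.9] }]
};

const coord = markPointCoord(option, 1); // → ['2024-02-01', 55.1]
```

In the real handler the pair is then applied via `cCapacity.setOption({ series: { markPoint: { data: [{ coord }] } } })`.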

dashboard/routes/web.php Normal file

@ -0,0 +1,228 @@
<?php
use Illuminate\Support\Facades\Route;
use App\Api;
use App\Chart;
Route::get('/', function () {
$regionBase = Api::regionBase(-1);
$regionPropertiesCapacities = Api::regionPropertiesCapacities(-1);
$propertiesGrowth = Api::propertiesGrowth();
$regions = Api::regions()['regions'];
$propertiesPerRegion = $regions;
$regions[] = ['name' => 'Alle Regionen', 'id' => -1];
$propertiesGeo = Api::propertiesGeo()['properties'];
$heatmapValues = [];
foreach ($regionPropertiesCapacities['values'] as $el) {
$heatmapValues[] = array_values($el);
}
$diagramsOptions = [
"shared" => [
"extractionDates" => json_encode($regionPropertiesCapacities['dates']),
"colors" => Chart::colors()
],
"heatmap" => [
"yAxis" => [
"data" => json_encode($regionPropertiesCapacities['property_ids'])
],
"series" => [
"data" => json_encode($heatmapValues)
]
],
"propertiesPerRegion" => [
"yAxis" => [
"data" => json_encode(array_column($propertiesPerRegion, 'count_properties'))
],
"xAxis" => [
"data" => json_encode(array_column($propertiesPerRegion, 'name'))
]
],
"extractions" => [
"series" => $propertiesGrowth,
]
];
return view('overview', [
"regions" => $regions,
"region" => $regionBase,
"diagramsOptions" => $diagramsOptions,
"geo" => $propertiesGeo,
]);
});
Route::get('/region/{id}', function (int $id) {
$regions = Api::regions()['regions'];
$regions[] = ['name' => 'Alle Regionen', 'id' => -1];
$region = $id >= 0 ? Api::regionBase($id) : ['name' => 'Alle Regionen'];
$regionPropertiesCapacities = Api::regionPropertiesCapacities($id);
$regionCapacitiesRegion = Api::regionCapacities($id);
$regionCapacitiesAll = Api::regionCapacities(-1);
$regionCapacitiesMonthly = [];
$regionCapacitiesDaily = [];
$regionPredictions = [];
$heatmapValues = [];
foreach ($regionPropertiesCapacities['values'] as $el) {
$heatmapValues[] = array_values($el);
}
foreach ($regionCapacitiesRegion['dates'] as $date) {
$regionCapacitiesMonthly[] = Api::regionCapacitiesMonthly($id, $date);
$regionCapacitiesDaily[] = Api::regionCapacitiesDaily($id, $date);
$regionPredictions[] = Api::regionMovingAverage($id, $date);
}
$diagramsOptions = [
"shared" => [
"extractionDates" => json_encode($regionPropertiesCapacities['dates']),
"colors" => Chart::colors()
],
"heatmap" => [
"xAxis" => [
"data" => json_encode($regionPropertiesCapacities['dates'])
],
"yAxis" => [
"data" => json_encode($regionPropertiesCapacities['property_ids'])
],
"series" => [
"data" => json_encode($heatmapValues)
]
],
"predictions" => [
"options" => $regionPredictions,
],
"capacityMonthly" => [
"options" => $regionCapacitiesMonthly,
],
"capacityDaily" => [
"options" => $regionCapacitiesDaily,
],
"capacity" => [
"xAxis" => [
"data" => json_encode($regionCapacitiesRegion['dates'])
],
"series" => [
"all" => [
"data" => json_encode($regionCapacitiesAll['capacities'])
],
"region" => [
"data" => json_encode($regionCapacitiesRegion['capacities'])
]
]
]
];
return view('region', [
'diagramsOptions' => $diagramsOptions,
'startDate' => $regionCapacitiesRegion['dates'][0],
'regions' => $regions,
'region' => $region,
'regionId' => $id,
'regionPropertiesCapacities' => $regionPropertiesCapacities,
'predictions' => $regionPredictions]);
});
Route::get('/property/{id}', function (int $id) {
$regions = Api::regions()['regions'];
$regions[] = ['name' => 'Alle Regionen', 'id' => -1];
$base = Api::propertyBase($id);
$calendars = Api::propertyExtractions($id)['extractions'];
$propertyCapacities = Api::propertyCapacities($id);
$propertyNeighbours = Api::propertyNeighbours($id)['neighbours'];
$regionCapacitiesRegion = Api::regionCapacities($base['region_id']);
$regionCapacitiesAll = Api::regionCapacities(-1);
$regionCapacities = [[],[]];
$propertyCapacitiesMonthly = [];
$propertyCapacitiesDaily = [];
if($propertyCapacities){
foreach ($propertyCapacities['dates'] as $date) {
$propertyCapacitiesMonthly[] = Api::propertyCapacitiesMonthly($id, $date);
$propertyCapacitiesDaily[] = Api::propertyCapacitiesDaily($id, $date);
}
// filter out all dates that were not scraped for the property
foreach ($regionCapacitiesAll['dates'] as $index => $date) {
if(in_array($date, $propertyCapacities['dates'])){
$regionCapacities[0][] = $regionCapacitiesAll['capacities'][$index];
}
}
foreach ($regionCapacitiesRegion['dates'] as $index => $date) {
if(in_array($date, $propertyCapacities['dates'])){
$regionCapacities[1][] = $regionCapacitiesRegion['capacities'][$index];
}
}
}else{
return view('property-nodata', [
'base' => $base,
'regions' => $regions,
]);
}
// prepare data for calendar chart
$calendarData = [];
foreach ($calendars as $el) {
$series = [];
$calendar = json_decode($el['calendar'], 1);
foreach ($calendar as $date => $status) {
$series[] = [$date, $status];
}
$calendarData[] = $series;
}
$diagramsOptions = [
"shared" => [
"colors" => Chart::colors(),
"extractiondates" => json_encode($propertyCapacities['dates'])
],
"calendar" => [
"series" => $calendarData
],
"capacities" => [
"xAxis" => [
"data" => json_encode($propertyCapacities['dates'])
],
"series" => [
["data" => json_encode($propertyCapacities['capacities'])],
["data" => json_encode($regionCapacities[0])],
["data" => json_encode($regionCapacities[1])],
]
],
"capacityMonthly" => [
"options" => $propertyCapacitiesMonthly,
],
"capacityDaily" => [
"options" => $propertyCapacitiesDaily,
],
];
return view('property', [
'diagramsOptions' => $diagramsOptions,
'startDate' => $propertyCapacities['dates'][0],
'base' => $base,
'regions' => $regions,
'neighbours' => $propertyNeighbours
]);
});
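The filtering loops in the `/property/{id}` route above align the region-wide capacity series with the dates actually scraped for the property. A hedged sketch of the same alignment as a standalone function (illustrative names, not the app's API; shown in JavaScript for brevity):

```javascript
// Keep only those region capacities whose extraction date was also scraped
// for the property, preserving the region series' original order.
function alignByDates(regionDates, regionCapacities, propertyDates) {
  const scraped = new Set(propertyDates);
  return regionDates.flatMap((date, i) =>
    scraped.has(date) ? [regionCapacities[i]] : []
  );
}

const aligned = alignByDates(
  ['2024-01-01', '2024-02-01', '2024-03-01'],
  [70.2, 65.4, 80.1],
  ['2024-01-01', '2024-03-01']
); // → [70.2, 80.1]
```

Using a `Set` keeps the membership test O(1) per date; the PHP version achieves the same with `in_array` over the (short) date list.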

dashboard/storage/app/.gitignore vendored Normal file

@ -0,0 +1,4 @@
*
!private/
!public/
!.gitignore


@ -0,0 +1,2 @@
*
!.gitignore


@ -0,0 +1,2 @@
*
!.gitignore


@ -0,0 +1,9 @@
compiled.php
config.php
down
events.scanned.php
maintenance.php
routes.php
routes.scanned.php
schedule-*
services.json


@ -0,0 +1,3 @@
*
!data/
!.gitignore


@ -0,0 +1,2 @@
*
!.gitignore


@ -0,0 +1,2 @@
*
!.gitignore


@ -0,0 +1,2 @@
*
!.gitignore


@ -0,0 +1,2 @@
*
!.gitignore

dashboard/storage/logs/.gitignore vendored Normal file

@ -0,0 +1,2 @@
*
!.gitignore


@ -0,0 +1,20 @@
import defaultTheme from 'tailwindcss/defaultTheme';
/** @type {import('tailwindcss').Config} */
export default {
content: [
'./vendor/laravel/framework/src/Illuminate/Pagination/resources/views/*.blade.php',
'./storage/framework/views/*.php',
'./resources/**/*.blade.php',
'./resources/**/*.js',
'./resources/**/*.vue',
],
theme: {
extend: {
fontFamily: {
sans: ['Figtree', ...defaultTheme.fontFamily.sans],
},
},
},
plugins: [],
};


@ -0,0 +1,19 @@
<?php
namespace Tests\Feature;
// use Illuminate\Foundation\Testing\RefreshDatabase;
use Tests\TestCase;
class ExampleTest extends TestCase
{
/**
* A basic test example.
*/
public function test_the_application_returns_a_successful_response(): void
{
$response = $this->get('/');
$response->assertStatus(200);
}
}


@ -0,0 +1,10 @@
<?php
namespace Tests;
use Illuminate\Foundation\Testing\TestCase as BaseTestCase;
abstract class TestCase extends BaseTestCase
{
//
}


@ -0,0 +1,16 @@
<?php
namespace Tests\Unit;
use PHPUnit\Framework\TestCase;
class ExampleTest extends TestCase
{
/**
* A basic test example.
*/
public function test_that_true_is_true(): void
{
$this->assertTrue(true);
}
}

dashboard/vite.config.js Normal file

@ -0,0 +1,11 @@
import { defineConfig } from 'vite';
import laravel from 'laravel-vite-plugin';
export default defineConfig({
plugins: [
laravel({
input: ['resources/css/app.css', 'resources/js/app.js'],
refresh: true,
}),
],
});


@ -0,0 +1,142 @@
<mxfile host="app.diagrams.net" agent="Mozilla/5.0 (X11; Linux x86_64; rv:134.0) Gecko/20100101 Firefox/134.0" version="26.0.6">
<diagram name="Seite-1" id="WNMV8rePnVf-2Vz_xhjt">
<mxGraphModel dx="1937" dy="1185" grid="1" gridSize="10" guides="1" tooltips="1" connect="1" arrows="1" fold="1" page="1" pageScale="1" pageWidth="827" pageHeight="1169" math="0" shadow="0">
<root>
<mxCell id="0" />
<mxCell id="1" parent="0" />
<mxCell id="e6qn9whkbaCBCFCjUvdY-7" value="" style="rounded=0;whiteSpace=wrap;html=1;strokeColor=none;fillColor=#F5F5F5;" vertex="1" parent="1">
<mxGeometry x="10" y="420" width="1070" height="690" as="geometry" />
</mxCell>
<object placeholders="1" c4Name="ETL" c4Type="ContainerScopeBoundary" c4Application="Component" label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;&lt;div style=&quot;text-align: left&quot;&gt;%c4Name%&lt;/div&gt;&lt;/b&gt;&lt;/font&gt;&lt;div style=&quot;text-align: left&quot;&gt;[%c4Application%]&lt;/div&gt;" id="0Mexl9jQAquWokRCgHYt-11">
<mxCell style="rounded=1;fontSize=11;whiteSpace=wrap;html=1;dashed=1;arcSize=20;fillColor=default;strokeColor=#666666;fontColor=#333333;labelBackgroundColor=none;align=left;verticalAlign=bottom;labelBorderColor=none;spacingTop=0;spacing=10;dashPattern=8 4;metaEdit=1;rotatable=0;perimeter=rectanglePerimeter;noLabel=0;labelPadding=0;allowArrows=0;connectable=0;expand=0;recursiveResize=0;editable=1;pointerEvents=0;absoluteArcSize=1;points=[[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.75,1,0],[0.5,1,0],[0.25,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];" parent="1" vertex="1">
<mxGeometry x="30" y="440" width="1030" height="500" as="geometry" />
</mxCell>
</object>
<object placeholders="1" c4Name="Datenbank Aggregation" c4Type="Container" c4Technology="MySQL" c4Description="Datenbank welche während Aggregation verwendet wurde." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%:&amp;nbsp;%c4Technology%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#E6E6E6&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="0Mexl9jQAquWokRCgHYt-1">
<mxCell style="shape=cylinder3;size=15;whiteSpace=wrap;html=1;boundedLbl=1;rounded=0;labelBackgroundColor=none;fillColor=#23A2D9;fontSize=12;fontColor=#ffffff;align=center;strokeColor=#0E7DAD;metaEdit=1;points=[[0.5,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.5,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];resizable=0;" parent="1" vertex="1">
<mxGeometry x="40" y="100" width="240" height="120" as="geometry" />
</mxCell>
</object>
<object placeholders="1" c4Name="Datenbank Analyse" c4Type="Container" c4Technology="DuckDB" c4Description="Datenbank, welcher für die Analysen&lt;br&gt; verwendet wurden." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%:&amp;nbsp;%c4Technology%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#E6E6E6&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="0Mexl9jQAquWokRCgHYt-2">
<mxCell style="shape=cylinder3;size=15;whiteSpace=wrap;html=1;boundedLbl=1;rounded=0;labelBackgroundColor=none;fillColor=#23A2D9;fontSize=12;fontColor=#ffffff;align=center;strokeColor=#0E7DAD;metaEdit=1;points=[[0.5,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.5,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];resizable=0;" parent="1" vertex="1">
<mxGeometry x="790" y="100" width="240" height="120" as="geometry" />
</mxCell>
</object>
<mxCell id="0Mexl9jQAquWokRCgHYt-5" style="edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=0;exitY=0.5;exitDx=0;exitDy=0;exitPerimeter=0;dashed=1;dashPattern=8 8;" parent="1" source="0Mexl9jQAquWokRCgHYt-3" target="0Mexl9jQAquWokRCgHYt-1" edge="1">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="0Mexl9jQAquWokRCgHYt-7" value="liest Datenbank" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" parent="0Mexl9jQAquWokRCgHYt-5" vertex="1" connectable="0">
<mxGeometry x="-0.2497" y="-1" relative="1" as="geometry">
<mxPoint x="-10" y="1" as="offset" />
</mxGeometry>
</mxCell>
<object placeholders="1" c4Name="Sling" c4Type="sling-cli" c4Description="Kommandozeilenprogramm zur Migration von Datensätzen." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#cccccc&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="0Mexl9jQAquWokRCgHYt-3">
<mxCell style="rounded=1;whiteSpace=wrap;html=1;labelBackgroundColor=none;fillColor=#1061B0;fontColor=#ffffff;align=center;arcSize=10;strokeColor=#0D5091;metaEdit=1;resizable=0;points=[[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.75,1,0],[0.5,1,0],[0.25,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];" parent="1" vertex="1">
<mxGeometry x="400" y="100" width="240" height="120" as="geometry" />
</mxCell>
</object>
<mxCell id="0Mexl9jQAquWokRCgHYt-6" style="edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;exitPerimeter=0;entryX=0;entryY=0.5;entryDx=0;entryDy=0;entryPerimeter=0;dashed=1;dashPattern=8 8;" parent="1" source="0Mexl9jQAquWokRCgHYt-3" target="0Mexl9jQAquWokRCgHYt-2" edge="1">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="0Mexl9jQAquWokRCgHYt-8" value="schreibt in Datenbank" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" parent="0Mexl9jQAquWokRCgHYt-6" vertex="1" connectable="0">
<mxGeometry x="-0.1744" relative="1" as="geometry">
<mxPoint x="12" y="-1" as="offset" />
</mxGeometry>
</mxCell>
<object placeholders="1" c4Name="Preprocessing" c4Type="ContainerScopeBoundary" c4Application="Component" label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;&lt;div style=&quot;text-align: left&quot;&gt;%c4Name%&lt;/div&gt;&lt;/b&gt;&lt;/font&gt;&lt;div style=&quot;text-align: left&quot;&gt;[%c4Application%]&lt;/div&gt;" id="0Mexl9jQAquWokRCgHYt-9">
<mxCell style="rounded=1;fontSize=11;whiteSpace=wrap;html=1;dashed=1;arcSize=20;fillColor=none;strokeColor=#666666;fontColor=#333333;labelBackgroundColor=none;align=left;verticalAlign=bottom;labelBorderColor=none;spacingTop=0;spacing=10;dashPattern=8 4;metaEdit=1;rotatable=0;perimeter=rectanglePerimeter;noLabel=0;labelPadding=0;allowArrows=0;connectable=0;expand=0;recursiveResize=0;editable=1;pointerEvents=0;absoluteArcSize=1;points=[[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.75,1,0],[0.5,1,0],[0.25,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];" parent="1" vertex="1">
<mxGeometry x="20" y="40" width="1030" height="270" as="geometry" />
</mxCell>
</object>
<object placeholders="1" c4Name="Datenbank" c4Type="Container" c4Technology="DuckDB" c4Description="Datenbank, welcher für die Analysen&lt;br&gt; verwendet wurden." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%:&amp;nbsp;%c4Technology%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#E6E6E6&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="0Mexl9jQAquWokRCgHYt-10">
<mxCell style="shape=cylinder3;size=15;whiteSpace=wrap;html=1;boundedLbl=1;rounded=0;labelBackgroundColor=none;fillColor=#23A2D9;fontSize=12;fontColor=#ffffff;align=center;strokeColor=#0E7DAD;metaEdit=1;points=[[0.5,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.5,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];resizable=0;" parent="1" vertex="1">
<mxGeometry x="80" y="480" width="240" height="120" as="geometry" />
</mxCell>
</object>
<mxCell id="0Mexl9jQAquWokRCgHYt-23" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;exitPerimeter=0;entryX=0;entryY=0.5;entryDx=0;entryDy=0;entryPerimeter=0;dashed=1;dashPattern=8 8;" parent="1" source="0Mexl9jQAquWokRCgHYt-12" target="0Mexl9jQAquWokRCgHYt-14" edge="1">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="0Mexl9jQAquWokRCgHYt-24" value="verwendet" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];fontSize=14;" parent="0Mexl9jQAquWokRCgHYt-23" vertex="1" connectable="0">
<mxGeometry x="-0.0114" y="-2" relative="1" as="geometry">
<mxPoint y="-2" as="offset" />
</mxGeometry>
</mxCell>
<object placeholders="1" c4Name="etl_*.py" c4Type="Python (Polars)" c4Description="Diverse Python Skripts zur Aufbereitung / Zusammenstellung der Daten." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#cccccc&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="0Mexl9jQAquWokRCgHYt-12">
<mxCell style="rounded=1;whiteSpace=wrap;html=1;labelBackgroundColor=none;fillColor=#1061B0;fontColor=#ffffff;align=center;arcSize=10;strokeColor=#0D5091;metaEdit=1;resizable=0;points=[[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.75,1,0],[0.5,1,0],[0.25,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];" parent="1" vertex="1">
<mxGeometry x="430" y="710" width="240" height="120" as="geometry" />
</mxCell>
</object>
<mxCell id="0Mexl9jQAquWokRCgHYt-16" style="edgeStyle=orthogonalEdgeStyle;rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=0.5;exitY=0;exitDx=0;exitDy=0;exitPerimeter=0;dashed=1;dashPattern=8 8;" parent="1" source="0Mexl9jQAquWokRCgHYt-13" target="0Mexl9jQAquWokRCgHYt-10" edge="1">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="0Mexl9jQAquWokRCgHYt-17" value="liest Datenbank" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];fontSize=14;" parent="0Mexl9jQAquWokRCgHYt-16" vertex="1" connectable="0">
<mxGeometry x="-0.1633" relative="1" as="geometry">
<mxPoint as="offset" />
</mxGeometry>
</mxCell>
<object placeholders="1" c4Name="database.py" c4Type="Python (DuckDB Interface)" c4Description="Wrapper Skript zum Ausführen von SQL." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#cccccc&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="0Mexl9jQAquWokRCgHYt-13">
<mxCell style="rounded=1;whiteSpace=wrap;html=1;labelBackgroundColor=none;fillColor=#1061B0;fontColor=#ffffff;align=center;arcSize=10;strokeColor=#0D5091;metaEdit=1;resizable=0;points=[[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.75,1,0],[0.5,1,0],[0.25,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];" parent="1" vertex="1">
<mxGeometry x="80" y="710" width="240" height="120" as="geometry" />
</mxCell>
</object>
<mxCell id="0Mexl9jQAquWokRCgHYt-18" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=0.25;exitY=0;exitDx=0;exitDy=0;exitPerimeter=0;dashed=1;dashPattern=8 8;entryX=0.24;entryY=0.981;entryDx=0;entryDy=0;entryPerimeter=0;" parent="1" source="0Mexl9jQAquWokRCgHYt-14" target="0Mexl9jQAquWokRCgHYt-15" edge="1">
<mxGeometry relative="1" as="geometry">
<mxPoint x="900" y="600" as="targetPoint" />
</mxGeometry>
</mxCell>
<mxCell id="0Mexl9jQAquWokRCgHYt-19" value="schreibt pickle objekt" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];fontSize=14;" parent="0Mexl9jQAquWokRCgHYt-18" vertex="1" connectable="0">
<mxGeometry x="-0.1818" y="2" relative="1" as="geometry">
<mxPoint as="offset" />
</mxGeometry>
</mxCell>
<object placeholders="1" c4Name="etl_cache.py" c4Type="Python (Pickle)" c4Description="Diverse Python Skripts zur Aufbereitung / Zusammenstellung der Daten." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#cccccc&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="0Mexl9jQAquWokRCgHYt-14">
<mxCell style="rounded=1;whiteSpace=wrap;html=1;labelBackgroundColor=none;fillColor=#1061B0;fontColor=#ffffff;align=center;arcSize=10;strokeColor=#0D5091;metaEdit=1;resizable=0;points=[[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.75,1,0],[0.5,1,0],[0.25,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];" parent="1" vertex="1">
<mxGeometry x="780" y="710" width="240" height="120" as="geometry" />
</mxCell>
</object>
<object placeholders="1" c4Name="Cache" c4Type="Container" c4Technology="Filesystem" c4Description="Das Dateisystem wird als Pufferspeicher verwendet." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%:&amp;nbsp;%c4Technology%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#E6E6E6&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="0Mexl9jQAquWokRCgHYt-15">
<mxCell style="shape=cylinder3;size=15;whiteSpace=wrap;html=1;boundedLbl=1;rounded=0;labelBackgroundColor=none;fillColor=#23A2D9;fontSize=12;fontColor=#ffffff;align=center;strokeColor=#0E7DAD;metaEdit=1;points=[[0.5,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.5,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];resizable=0;" parent="1" vertex="1">
<mxGeometry x="780" y="480" width="240" height="120" as="geometry" />
</mxCell>
</object>
<mxCell id="0Mexl9jQAquWokRCgHYt-20" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;entryX=0.746;entryY=1.002;entryDx=0;entryDy=0;entryPerimeter=0;dashed=1;dashPattern=8 8;exitX=0.75;exitY=0;exitDx=0;exitDy=0;exitPerimeter=0;" parent="1" source="0Mexl9jQAquWokRCgHYt-14" target="0Mexl9jQAquWokRCgHYt-15" edge="1">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="0Mexl9jQAquWokRCgHYt-21" value="liest pickle objekt" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];fontSize=14;" parent="0Mexl9jQAquWokRCgHYt-20" vertex="1" connectable="0">
<mxGeometry x="-0.1076" y="1" relative="1" as="geometry">
<mxPoint x="8" y="-11" as="offset" />
</mxGeometry>
</mxCell>
<mxCell id="0Mexl9jQAquWokRCgHYt-25" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=0;exitY=0.5;exitDx=0;exitDy=0;exitPerimeter=0;entryX=1;entryY=0.5;entryDx=0;entryDy=0;entryPerimeter=0;dashed=1;dashPattern=8 8;" parent="1" source="0Mexl9jQAquWokRCgHYt-12" target="0Mexl9jQAquWokRCgHYt-13" edge="1">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="0Mexl9jQAquWokRCgHYt-26" value="verwendet" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];fontSize=14;" parent="0Mexl9jQAquWokRCgHYt-25" vertex="1" connectable="0">
<mxGeometry x="0.0473" relative="1" as="geometry">
<mxPoint as="offset" />
</mxGeometry>
</mxCell>
<mxCell id="e6qn9whkbaCBCFCjUvdY-3" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=0.5;exitY=0;exitDx=0;exitDy=0;exitPerimeter=0;entryX=0.5;entryY=1;entryDx=0;entryDy=0;entryPerimeter=0;dashed=1;dashPattern=8 8;" edge="1" parent="1" source="e6qn9whkbaCBCFCjUvdY-1" target="0Mexl9jQAquWokRCgHYt-13">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="e6qn9whkbaCBCFCjUvdY-6" value="führt Funktionen aus" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];fontSize=14;" vertex="1" connectable="0" parent="e6qn9whkbaCBCFCjUvdY-3">
<mxGeometry x="0.0906" relative="1" as="geometry">
<mxPoint y="1" as="offset" />
</mxGeometry>
</mxCell>
<object placeholders="1" c4Name="FastAPI" c4Type="Python (FastAPI)" c4Description="Stellt die aufbereiteten Daten über eine JSON-Schnittstelle zur Verfügung." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#cccccc&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="e6qn9whkbaCBCFCjUvdY-1">
<mxCell style="rounded=1;whiteSpace=wrap;html=1;labelBackgroundColor=none;fillColor=#1061B0;fontColor=#ffffff;align=center;arcSize=10;strokeColor=#0D5091;metaEdit=1;resizable=0;points=[[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.75,1,0],[0.5,1,0],[0.25,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];" vertex="1" parent="1">
<mxGeometry x="430" y="970" width="240" height="120" as="geometry" />
</mxCell>
</object>
<mxCell id="e6qn9whkbaCBCFCjUvdY-2" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=0.5;exitY=0;exitDx=0;exitDy=0;exitPerimeter=0;entryX=0.5;entryY=1;entryDx=0;entryDy=0;entryPerimeter=0;dashed=1;dashPattern=8 8;" edge="1" parent="1" source="e6qn9whkbaCBCFCjUvdY-1" target="0Mexl9jQAquWokRCgHYt-12">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="e6qn9whkbaCBCFCjUvdY-5" value="führt Funktionen aus" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];fontSize=14;" vertex="1" connectable="0" parent="e6qn9whkbaCBCFCjUvdY-2">
<mxGeometry x="0.0286" y="-1" relative="1" as="geometry">
<mxPoint as="offset" />
</mxGeometry>
</mxCell>
</root>
</mxGraphModel>
</diagram>
</mxfile>


@ -0,0 +1,187 @@
<mxfile host="app.diagrams.net" agent="Mozilla/5.0 (X11; Linux x86_64; rv:134.0) Gecko/20100101 Firefox/134.0" version="26.0.6" pages="2">
<diagram name="Seite-1" id="chpUGVRRn7alPJZ1I-il">
<mxGraphModel dx="1291" dy="790" grid="1" gridSize="10" guides="1" tooltips="1" connect="1" arrows="1" fold="1" page="1" pageScale="1" pageWidth="827" pageHeight="1169" math="0" shadow="0">
<root>
<mxCell id="0" />
<mxCell id="1" parent="0" />
<mxCell id="tzVNFCieMdwak3VSEkXc-1" value="" style="rounded=0;whiteSpace=wrap;html=1;strokeColor=none;fillColor=#F5F5F5;" vertex="1" parent="1">
<mxGeometry x="10" y="20" width="750" height="780" as="geometry" />
</mxCell>
<object placeholders="1" c4Name="Visual Analytics Tool" c4Type="SystemScopeBoundary" c4Application="Software System" label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;&lt;div style=&quot;text-align: left&quot;&gt;%c4Name%&lt;/div&gt;&lt;/b&gt;&lt;/font&gt;&lt;div style=&quot;text-align: left&quot;&gt;[%c4Application%]&lt;/div&gt;" id="_wAeSdXpbb6KPP4DEc36-23">
<mxCell style="rounded=1;fontSize=11;whiteSpace=wrap;html=1;dashed=1;arcSize=20;fillColor=default;strokeColor=#666666;fontColor=#333333;labelBackgroundColor=none;align=left;verticalAlign=bottom;labelBorderColor=none;spacingTop=0;spacing=10;dashPattern=8 4;metaEdit=1;rotatable=0;perimeter=rectanglePerimeter;noLabel=0;labelPadding=0;allowArrows=0;connectable=0;expand=0;recursiveResize=0;editable=1;pointerEvents=0;absoluteArcSize=1;points=[[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.75,1,0],[0.5,1,0],[0.25,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];" parent="1" vertex="1">
<mxGeometry x="30" y="40" width="710" height="540" as="geometry" />
</mxCell>
</object>
<object placeholders="1" c4Name="Datenbank" c4Type="Container" c4Technology="DuckDB" c4Description="Aggregierte Daten." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%:&amp;nbsp;%c4Technology%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#E6E6E6&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="_wAeSdXpbb6KPP4DEc36-2">
<mxCell style="shape=cylinder3;size=15;whiteSpace=wrap;html=1;boundedLbl=1;rounded=0;labelBackgroundColor=none;fillColor=#23A2D9;fontSize=12;fontColor=#ffffff;align=center;strokeColor=#0E7DAD;metaEdit=1;points=[[0.5,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.5,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];resizable=0;" parent="1" vertex="1">
<mxGeometry x="50" y="60" width="240" height="120" as="geometry" />
</mxCell>
</object>
<object placeholders="1" c4Name="ETL" c4Type="SQL, Python (Polars)" c4Description="Bereitet Daten mittels algorithmischer&lt;br&gt; Verfahren auf." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#cccccc&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="_wAeSdXpbb6KPP4DEc36-3">
<mxCell style="rounded=1;whiteSpace=wrap;html=1;labelBackgroundColor=none;fillColor=#1061B0;fontColor=#ffffff;align=center;arcSize=10;strokeColor=#0D5091;metaEdit=1;resizable=0;points=[[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.75,1,0],[0.5,1,0],[0.25,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];" parent="1" vertex="1">
<mxGeometry x="480" y="60" width="240" height="120" as="geometry" />
</mxCell>
</object>
<mxCell id="_wAeSdXpbb6KPP4DEc36-4" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=0;exitY=0.5;exitDx=0;exitDy=0;exitPerimeter=0;entryX=1;entryY=0.5;entryDx=0;entryDy=0;entryPerimeter=0;dashed=1;dashPattern=8 8;" parent="1" source="_wAeSdXpbb6KPP4DEc36-3" target="_wAeSdXpbb6KPP4DEc36-2" edge="1">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="_wAeSdXpbb6KPP4DEc36-5" value="Liest Datenbank" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" parent="_wAeSdXpbb6KPP4DEc36-4" vertex="1" connectable="0">
<mxGeometry x="0.0412" y="1" relative="1" as="geometry">
<mxPoint x="-1" y="-1" as="offset" />
</mxGeometry>
</mxCell>
<mxCell id="_wAeSdXpbb6KPP4DEc36-15" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;exitPerimeter=0;dashed=1;dashPattern=8 8;entryX=0;entryY=0.5;entryDx=0;entryDy=0;entryPerimeter=0;" parent="1" source="_wAeSdXpbb6KPP4DEc36-6" target="_wAeSdXpbb6KPP4DEc36-13" edge="1">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="_wAeSdXpbb6KPP4DEc36-21" value="&lt;div&gt;Führt Abfragen aus&lt;/div&gt;&lt;div&gt;[JSON/HTTPS]&lt;br&gt;&lt;/div&gt;" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" parent="_wAeSdXpbb6KPP4DEc36-15" vertex="1" connectable="0">
<mxGeometry x="-0.0541" y="-1" relative="1" as="geometry">
<mxPoint as="offset" />
</mxGeometry>
</mxCell>
<object placeholders="1" c4Name="Webapplikation" c4Type="PHP (Laravel)" c4Description="Verarbeitet Anfragen von Benutzer:innen" label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#cccccc&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="_wAeSdXpbb6KPP4DEc36-6">
<mxCell style="rounded=1;whiteSpace=wrap;html=1;labelBackgroundColor=none;fillColor=#1061B0;fontColor=#ffffff;align=center;arcSize=10;strokeColor=#0D5091;metaEdit=1;resizable=0;points=[[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.75,1,0],[0.5,1,0],[0.25,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];" parent="1" vertex="1">
<mxGeometry x="50" y="230" width="240" height="120" as="geometry" />
</mxCell>
</object>
<object placeholders="1" c4Name="Dashboard" c4Type="Container" c4Technology="Apache Echarts" c4Description="Stellt Benutzer:innen Auswertungs-&lt;br&gt;möglichkeiten zur Verfügbarkeit von Kurzzeitmietobjekten." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%:&amp;nbsp;%c4Technology%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#E6E6E6&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="_wAeSdXpbb6KPP4DEc36-8">
<mxCell style="shape=mxgraph.c4.webBrowserContainer2;whiteSpace=wrap;html=1;boundedLbl=1;rounded=0;labelBackgroundColor=none;strokeColor=#118ACD;fillColor=#23A2D9;strokeColor=#118ACD;strokeColor2=#0E7DAD;fontSize=12;fontColor=#ffffff;align=center;metaEdit=1;points=[[0.5,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.5,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];resizable=0;" parent="1" vertex="1">
<mxGeometry x="480" y="370" width="240" height="160" as="geometry" />
</mxCell>
</object>
<mxCell id="_wAeSdXpbb6KPP4DEc36-10" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=0.5;exitY=0;exitDx=0;exitDy=0;exitPerimeter=0;dashed=1;dashPattern=8 8;" parent="1" source="_wAeSdXpbb6KPP4DEc36-9" target="_wAeSdXpbb6KPP4DEc36-6" edge="1">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="_wAeSdXpbb6KPP4DEc36-16" value="&lt;div&gt;Besucht Webapplikation&lt;/div&gt;&lt;div&gt;[HTTPS]&lt;br&gt;&lt;/div&gt;" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" parent="_wAeSdXpbb6KPP4DEc36-10" vertex="1" connectable="0">
<mxGeometry x="0.1247" y="-2" relative="1" as="geometry">
<mxPoint as="offset" />
</mxGeometry>
</mxCell>
<mxCell id="_wAeSdXpbb6KPP4DEc36-11" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=0.5;exitY=0;exitDx=0;exitDy=0;exitPerimeter=0;dashed=1;dashPattern=8 8;" parent="1" source="_wAeSdXpbb6KPP4DEc36-9" target="_wAeSdXpbb6KPP4DEc36-8" edge="1">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="_wAeSdXpbb6KPP4DEc36-17" value="Betrachtet Auswertungen" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" parent="_wAeSdXpbb6KPP4DEc36-11" vertex="1" connectable="0">
<mxGeometry x="0.2151" y="-1" relative="1" as="geometry">
<mxPoint x="2" as="offset" />
</mxGeometry>
</mxCell>
<object placeholders="1" c4Name="Benutzer:in" c4Type="Person" c4Description="Person welche Auswertungen zur Verfügbarkeit von Kurzzeitmietobjekten in Ferienregionen durchführt." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#cccccc&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="_wAeSdXpbb6KPP4DEc36-9">
<mxCell style="html=1;fontSize=11;dashed=0;whiteSpace=wrap;fillColor=#083F75;strokeColor=#06315C;fontColor=#ffffff;shape=mxgraph.c4.person2;align=center;metaEdit=1;points=[[0.5,0,0],[1,0.5,0],[1,0.75,0],[0.75,1,0],[0.5,1,0],[0.25,1,0],[0,0.75,0],[0,0.5,0]];resizable=0;" parent="1" vertex="1">
<mxGeometry x="314" y="600" width="200" height="180" as="geometry" />
</mxCell>
</object>
<mxCell id="_wAeSdXpbb6KPP4DEc36-14" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=0.5;exitY=0;exitDx=0;exitDy=0;exitPerimeter=0;entryX=0.5;entryY=1;entryDx=0;entryDy=0;entryPerimeter=0;dashed=1;dashPattern=8 8;" parent="1" source="_wAeSdXpbb6KPP4DEc36-13" target="_wAeSdXpbb6KPP4DEc36-3" edge="1">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="_wAeSdXpbb6KPP4DEc36-20" value="Ruft ETL Verfahren auf" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" parent="_wAeSdXpbb6KPP4DEc36-14" vertex="1" connectable="0">
<mxGeometry x="-0.0667" y="-1" relative="1" as="geometry">
<mxPoint as="offset" />
</mxGeometry>
</mxCell>
<object placeholders="1" c4Name="FastAPI" c4Type="Python (FastAPI)" c4Description="Stellt aufbereitete Daten via &lt;br&gt;JSON/HTTPS API zur Verfügung." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#cccccc&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="_wAeSdXpbb6KPP4DEc36-13">
<mxCell style="rounded=1;whiteSpace=wrap;html=1;labelBackgroundColor=none;fillColor=#1061B0;fontColor=#ffffff;align=center;arcSize=10;strokeColor=#0D5091;metaEdit=1;resizable=0;points=[[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.75,1,0],[0.5,1,0],[0.25,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];" parent="1" vertex="1">
<mxGeometry x="480" y="230" width="240" height="120" as="geometry" />
</mxCell>
</object>
<mxCell id="_wAeSdXpbb6KPP4DEc36-18" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=1;exitY=0.5;exitDx=0;exitDy=0;exitPerimeter=0;entryX=0;entryY=0.5;entryDx=0;entryDy=0;entryPerimeter=0;dashed=1;dashPattern=8 8;" parent="1" source="_wAeSdXpbb6KPP4DEc36-6" target="_wAeSdXpbb6KPP4DEc36-8" edge="1">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="_wAeSdXpbb6KPP4DEc36-19" value="&lt;div&gt;Liefert Inhalte zum Webbrowser&amp;nbsp;&lt;/div&gt;&lt;div&gt;von Benutzer:innen&lt;/div&gt;" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" parent="_wAeSdXpbb6KPP4DEc36-18" vertex="1" connectable="0">
<mxGeometry x="-0.0888" y="-2" relative="1" as="geometry">
<mxPoint x="5" y="7" as="offset" />
</mxGeometry>
</mxCell>
</root>
</mxGraphModel>
</diagram>
<diagram id="2goo0GJ--Dnj9rEJibSb" name="Seite-2">
<mxGraphModel dx="2285" dy="1267" grid="1" gridSize="10" guides="1" tooltips="1" connect="1" arrows="1" fold="1" page="1" pageScale="1" pageWidth="827" pageHeight="1169" math="0" shadow="0">
<root>
<mxCell id="0" />
<mxCell id="1" parent="0" />
<object placeholders="1" c4Name="RDBMS" c4Type="Container" c4Technology="DuckDB" c4Description="Aggregierte Daten." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%:&amp;nbsp;%c4Technology%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#E6E6E6&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="Xmw1x83A06H2_JC6hK8s-1">
<mxCell style="shape=cylinder3;size=15;whiteSpace=wrap;html=1;boundedLbl=1;rounded=0;labelBackgroundColor=none;fillColor=#23A2D9;fontSize=12;fontColor=#ffffff;align=center;strokeColor=#0E7DAD;metaEdit=1;points=[[0.5,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.5,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];resizable=0;" vertex="1" parent="1">
<mxGeometry x="40" y="230" width="240" height="120" as="geometry" />
</mxCell>
</object>
<object placeholders="1" c4Name="ETL" c4Type="SQL, Python (Polars)" c4Description="Bereitet Daten mittels algorithmischer&lt;br&gt; Verfahren auf." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#cccccc&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="Xmw1x83A06H2_JC6hK8s-2">
<mxCell style="rounded=1;whiteSpace=wrap;html=1;labelBackgroundColor=none;fillColor=#1061B0;fontColor=#ffffff;align=center;arcSize=10;strokeColor=#0D5091;metaEdit=1;resizable=0;points=[[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.75,1,0],[0.5,1,0],[0.25,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];" vertex="1" parent="1">
<mxGeometry x="40" y="464" width="240" height="120" as="geometry" />
</mxCell>
</object>
<mxCell id="Xmw1x83A06H2_JC6hK8s-3" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=0.5;exitY=0;exitDx=0;exitDy=0;exitPerimeter=0;entryX=0.5;entryY=1;entryDx=0;entryDy=0;entryPerimeter=0;dashed=1;dashPattern=8 8;" edge="1" parent="1" source="Xmw1x83A06H2_JC6hK8s-2" target="Xmw1x83A06H2_JC6hK8s-1">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="Xmw1x83A06H2_JC6hK8s-4" value="Liest Datenbank" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" vertex="1" connectable="0" parent="Xmw1x83A06H2_JC6hK8s-3">
<mxGeometry x="0.0412" y="1" relative="1" as="geometry">
<mxPoint x="-1" y="-1" as="offset" />
</mxGeometry>
</mxCell>
<mxCell id="Xmw1x83A06H2_JC6hK8s-5" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=0;exitY=0.5;exitDx=0;exitDy=0;exitPerimeter=0;dashed=1;dashPattern=8 8;entryX=1;entryY=0.5;entryDx=0;entryDy=0;entryPerimeter=0;" edge="1" parent="1" source="Xmw1x83A06H2_JC6hK8s-7" target="Xmw1x83A06H2_JC6hK8s-16">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="Xmw1x83A06H2_JC6hK8s-6" value="&lt;div&gt;Führt Abfragen aus&lt;/div&gt;&lt;div&gt;[JSON/HTTPS]&lt;br&gt;&lt;/div&gt;" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" vertex="1" connectable="0" parent="Xmw1x83A06H2_JC6hK8s-5">
<mxGeometry x="-0.0541" y="-1" relative="1" as="geometry">
<mxPoint as="offset" />
</mxGeometry>
</mxCell>
<object placeholders="1" c4Name="Webapplikation" c4Type="PHP (Laravel)" c4Description="Verarbeitet Anfragen von Benutzer:innen" label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#cccccc&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="Xmw1x83A06H2_JC6hK8s-7">
<mxCell style="rounded=1;whiteSpace=wrap;html=1;labelBackgroundColor=none;fillColor=#1061B0;fontColor=#ffffff;align=center;arcSize=10;strokeColor=#0D5091;metaEdit=1;resizable=0;points=[[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.75,1,0],[0.5,1,0],[0.25,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];" vertex="1" parent="1">
<mxGeometry x="710" y="240" width="240" height="120" as="geometry" />
</mxCell>
</object>
<object placeholders="1" c4Name="Dashboard" c4Type="Container" c4Technology="Apache Echarts" c4Description="Stellt Benutzer:innen Auswertungs-&lt;br&gt;möglichkeiten zur Verfügbarkeit von Kurzzeitmietobjekten." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%:&amp;nbsp;%c4Technology%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#E6E6E6&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="Xmw1x83A06H2_JC6hK8s-8">
<mxCell style="shape=mxgraph.c4.webBrowserContainer2;whiteSpace=wrap;html=1;boundedLbl=1;rounded=0;labelBackgroundColor=none;strokeColor=#118ACD;fillColor=#23A2D9;strokeColor=#118ACD;strokeColor2=#0E7DAD;fontSize=12;fontColor=#ffffff;align=center;metaEdit=1;points=[[0.5,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.5,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];resizable=0;" vertex="1" parent="1">
<mxGeometry x="710" y="470" width="240" height="160" as="geometry" />
</mxCell>
</object>
<mxCell id="Xmw1x83A06H2_JC6hK8s-9" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=0;exitY=0.5;exitDx=0;exitDy=0;exitPerimeter=0;dashed=1;dashPattern=8 8;entryX=1;entryY=0.5;entryDx=0;entryDy=0;entryPerimeter=0;" edge="1" parent="1" source="Xmw1x83A06H2_JC6hK8s-13" target="Xmw1x83A06H2_JC6hK8s-7">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="Xmw1x83A06H2_JC6hK8s-10" value="&lt;div&gt;Besucht Webapplikation&lt;/div&gt;&lt;div&gt;[HTTPS]&lt;br&gt;&lt;/div&gt;" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" vertex="1" connectable="0" parent="Xmw1x83A06H2_JC6hK8s-9">
<mxGeometry x="0.1247" y="-2" relative="1" as="geometry">
<mxPoint x="4" y="4" as="offset" />
</mxGeometry>
</mxCell>
<mxCell id="Xmw1x83A06H2_JC6hK8s-11" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=0;exitY=0.5;exitDx=0;exitDy=0;exitPerimeter=0;dashed=1;dashPattern=8 8;entryX=1;entryY=0.5;entryDx=0;entryDy=0;entryPerimeter=0;" edge="1" parent="1" source="Xmw1x83A06H2_JC6hK8s-13" target="Xmw1x83A06H2_JC6hK8s-8">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="Xmw1x83A06H2_JC6hK8s-12" value="Betrachtet Auswertungen" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" vertex="1" connectable="0" parent="Xmw1x83A06H2_JC6hK8s-11">
<mxGeometry x="0.2151" y="-1" relative="1" as="geometry">
<mxPoint x="13" as="offset" />
</mxGeometry>
</mxCell>
<object placeholders="1" c4Name="Benutzer:in" c4Type="Person" c4Description="Person welche Auswertungen zur Verfügbarkeit von Kurzzeitmietobjekten in Ferienregionen durchführt." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#cccccc&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="Xmw1x83A06H2_JC6hK8s-13">
<mxCell style="html=1;fontSize=11;dashed=0;whiteSpace=wrap;fillColor=#083F75;strokeColor=#06315C;fontColor=#ffffff;shape=mxgraph.c4.person2;align=center;metaEdit=1;points=[[0.5,0,0],[1,0.5,0],[1,0.75,0],[0.75,1,0],[0.5,1,0],[0.25,1,0],[0,0.75,0],[0,0.5,0]];resizable=0;" vertex="1" parent="1">
<mxGeometry x="1120" y="320" width="200" height="180" as="geometry" />
</mxCell>
</object>
<mxCell id="Xmw1x83A06H2_JC6hK8s-14" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=0;exitY=0.5;exitDx=0;exitDy=0;exitPerimeter=0;entryX=1;entryY=0.5;entryDx=0;entryDy=0;entryPerimeter=0;dashed=1;dashPattern=8 8;" edge="1" parent="1" source="Xmw1x83A06H2_JC6hK8s-16" target="Xmw1x83A06H2_JC6hK8s-2">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="Xmw1x83A06H2_JC6hK8s-15" value="Ruft ETL Verfahren auf" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" vertex="1" connectable="0" parent="Xmw1x83A06H2_JC6hK8s-14">
<mxGeometry x="-0.0667" y="-1" relative="1" as="geometry">
<mxPoint as="offset" />
</mxGeometry>
</mxCell>
<object placeholders="1" c4Name="FastAPI" c4Type="Python (FastAPI)" c4Description="Stellt aufbereitete Daten via &lt;br&gt;JSON/HTTPS API zur Verfügung." label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;%c4Name%&lt;/b&gt;&lt;/font&gt;&lt;div&gt;[%c4Type%]&lt;/div&gt;&lt;br&gt;&lt;div&gt;&lt;font style=&quot;font-size: 11px&quot;&gt;&lt;font color=&quot;#cccccc&quot;&gt;%c4Description%&lt;/font&gt;&lt;/div&gt;" id="Xmw1x83A06H2_JC6hK8s-16">
<mxCell style="rounded=1;whiteSpace=wrap;html=1;labelBackgroundColor=none;fillColor=#1061B0;fontColor=#ffffff;align=center;arcSize=10;strokeColor=#0D5091;metaEdit=1;resizable=0;points=[[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.75,1,0],[0.5,1,0],[0.25,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];" vertex="1" parent="1">
<mxGeometry x="330" y="240" width="240" height="120" as="geometry" />
</mxCell>
</object>
<mxCell id="Xmw1x83A06H2_JC6hK8s-17" style="rounded=0;orthogonalLoop=1;jettySize=auto;html=1;exitX=0.5;exitY=1;exitDx=0;exitDy=0;exitPerimeter=0;entryX=0.5;entryY=0;entryDx=0;entryDy=0;entryPerimeter=0;dashed=1;dashPattern=8 8;" edge="1" parent="1" source="Xmw1x83A06H2_JC6hK8s-7" target="Xmw1x83A06H2_JC6hK8s-8">
<mxGeometry relative="1" as="geometry" />
</mxCell>
<mxCell id="Xmw1x83A06H2_JC6hK8s-18" value="&lt;div&gt;Liefert Inhalte zum Webbrowser&amp;nbsp;&lt;/div&gt;&lt;div&gt;von Benutzer:innen&lt;/div&gt;" style="edgeLabel;html=1;align=center;verticalAlign=middle;resizable=0;points=[];" vertex="1" connectable="0" parent="Xmw1x83A06H2_JC6hK8s-17">
<mxGeometry x="-0.0888" y="-2" relative="1" as="geometry">
<mxPoint x="5" as="offset" />
</mxGeometry>
</mxCell>
<object placeholders="1" c4Name="Visual Analytics Tool" c4Type="SystemScopeBoundary" c4Application="Software System" label="&lt;font style=&quot;font-size: 16px&quot;&gt;&lt;b&gt;&lt;div style=&quot;text-align: left&quot;&gt;%c4Name%&lt;/div&gt;&lt;/b&gt;&lt;/font&gt;&lt;div style=&quot;text-align: left&quot;&gt;[%c4Application%]&lt;/div&gt;" id="Xmw1x83A06H2_JC6hK8s-19">
<mxCell style="rounded=1;fontSize=11;whiteSpace=wrap;html=1;dashed=1;arcSize=20;fillColor=none;strokeColor=#666666;fontColor=#333333;labelBackgroundColor=none;align=left;verticalAlign=bottom;labelBorderColor=none;spacingTop=0;spacing=10;dashPattern=8 4;metaEdit=1;rotatable=0;perimeter=rectanglePerimeter;noLabel=0;labelPadding=0;allowArrows=0;connectable=0;expand=0;recursiveResize=0;editable=1;pointerEvents=0;absoluteArcSize=1;points=[[0.25,0,0],[0.5,0,0],[0.75,0,0],[1,0.25,0],[1,0.5,0],[1,0.75,0],[0.75,1,0],[0.5,1,0],[0.25,1,0],[0,0.75,0],[0,0.5,0],[0,0.25,0]];" vertex="1" parent="1">
<mxGeometry x="20" y="210" width="1080" height="460" as="geometry" />
</mxCell>
</object>
</root>
</mxGraphModel>
</diagram>
</mxfile>

etl/README.md Normal file

@ -0,0 +1,25 @@
## Installation
Perform the following steps to install.
### Install dependencies
Dependencies are managed with [pixi](https://pixi.sh/).
```bash
pixi install
```
### Configure the database connection
Create an environment file:
```bash
cp src/.env.example .env
```
Set the database path in the newly created .env file:
```
DATABASE="/path/to/db.duckdb"
```
## Start FastAPI
Run FastAPI on a different port than the dashboard.
```bash
fastapi dev api/main.py --port 8080
```
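With the .env file in place, the ETL code can pick up the configured database path. A minimal sketch of parsing such a file (assuming plain `KEY="value"` lines; the project may instead use a dedicated dotenv loader):

```python
import tempfile

def parse_env(path):
    """Parse simple KEY="value" lines from a .env file."""
    env = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            # skip blank lines, comments, and lines without an assignment
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip().strip('"')
    return env

# Hypothetical file mirroring the DATABASE entry from the README
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as fh:
    fh.write('DATABASE="/path/to/db.duckdb"\n')
    name = fh.name
print(parse_env(name)["DATABASE"])  # /path/to/db.duckdb
```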


@ -2196,13 +2196,6 @@ packages:
- pkg:pypi/colorama?source=hash-mapping
size: 25170
timestamp: 1666700778190
- kind: pypi
name: consultancy-2
version: 0.1.0
path: .
sha256: 390e1115c19758a67a2876388f5a8fe69abc3609e68910e50ccb86a558ee67ee
requires_python: '>=3.11'
editable: true
- kind: conda
name: dnspython
version: 2.7.0
@ -2280,6 +2273,13 @@ packages:
purls: []
size: 6690
timestamp: 1718984720419
- kind: pypi
name: etl
version: 0.1.0
path: .
sha256: d682071d587e9be1fcf91237a1add69a92c34715bc491a4067b07d63ce79616d
requires_python: '>=3.11'
editable: true
- kind: conda
name: exceptiongroup
version: 1.2.2


@ -1,8 +1,7 @@
[project]
-authors = [{name = "Giò Diani", email = "mail@gionathandiani.name"}]
+authors = [{name = "Giò Diani", email = "mail@gionathandiani.name"}, {name = "Mauro Stoffel", email = "mauro.stoffel@stud.fhgr.ch"}, {name = "Colin Bolli", email = "colin.bolli@stud.fhgr.ch"}, {name = "Charles Winkler", email = "charles.winkler@stud.fhgr.ch"}]
-dependencies = []
-description = "Add a short description here"
+description = "Datenauferbeitung"
-name = "consultancy_2"
+name = "ETL"
requires-python = ">= 3.11"
version = "0.1.0"
@ -15,7 +14,7 @@ channels = ["conda-forge"]
platforms = ["win-64", "linux-64", "osx-64", "osx-arm64"]
[tool.pixi.pypi-dependencies]
-consultancy_2 = { path = ".", editable = true }
+etl = { path = ".", editable = true }
[tool.pixi.tasks]
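The `[tool.pixi.tasks]` table is still empty. A hypothetical task entry (the task name is an assumption, not part of the project) could wrap the FastAPI start command from the README so it runs via `pixi run api`:

```toml
[tool.pixi.tasks]
# Hypothetical task; the project does not define one yet
api = "fastapi dev api/main.py --port 8080"
```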

etl/src/api/main.py Normal file

@ -0,0 +1,268 @@
import datetime
from typing import List, Union
import data
import polars as pl
from data import etl_property_capacities as etl_pc
from data import etl_property_capacities_daily as etl_pcd
from data import etl_property_capacities_monthly as etl_pcm
from data import etl_property_neighbours as etl_pn
from data import etl_region_capacities as etl_rc
from data import etl_region_capacities_daily as etl_rcd
from data import etl_region_capacities_monthly as etl_rcm
from data import etl_region_movAverage as etl_rmA
from data import etl_region_properties_capacities as etl_rpc
from fastapi import FastAPI
from fastapi.responses import JSONResponse
from pydantic import BaseModel
class RegionsItems(BaseModel):
name: str
id: str
count_properties: int
class Regions(BaseModel):
regions: List[RegionsItems]
class RegionBase(BaseModel):
name: str
id: str
class RegionPropertiesCapacitiesValues(BaseModel):
date: str
property_id: str
capacity: float
class RegionCapacities(BaseModel):
capacities: List[float]
dates: List
class RegionCapacitiesMonthly(BaseModel):
months: List[str]
capacities: List[float]
class RegionCapacitiesDaily(BaseModel):
weekdays: List[str]
capacities: List[float]
class RegionPropertiesCapacities(BaseModel):
dates: List
property_ids: List
values: List[RegionPropertiesCapacitiesValues]
class RegionMovingAverage(BaseModel):
dates: List
capacities_timeframe_before: List[Union[float, None]]
capacities_timeframe_after: List[Union[float, None]]
capacities_moving_average: List[Union[float, None]]
class PropertiesGrowth(BaseModel):
dates: List
total_all: List[Union[int, None]]
total_heidiland: List[Union[int, None]]
total_engadin: List[Union[int, None]]
total_stmoritz: List[Union[int, None]]
total_davos: List[Union[int, None]]
class PropertiesGeoList(BaseModel):
property_id: str
latlng: str
region_id: str
class PropertiesGeo(BaseModel):
properties: List[PropertiesGeoList]
class PropertyNeighboursList(BaseModel):
id: str
lat: float
lon: float
class PropertyNeighbours(BaseModel):
neighbours: List[PropertyNeighboursList]
class PropertyExtractionsList(BaseModel):
calendar: str
date: str
class PropertyExtractions(BaseModel):
extractions: List[PropertyExtractionsList]
class PropertyCapacities(BaseModel):
capacities: List[float]
dates: List[str]
class PropertyCapacitiesMonthly(BaseModel):
months: List[str]
capacities: List[float]
class PropertyCapacitiesDaily(BaseModel):
weekdays: List[str]
capacities: List[float]
class PropertyBaseDetail(BaseModel):
property_platform_id: str
first_found: str
last_found: str
latlng: str
region_id: str
region_name: str
class PropertyBase(BaseModel):
property_platform_id: str
first_found: str
last_found: str
latlng: str
region_id: str
region_name: str
d = data.load()
tags_metadata = [
{
"name": "region",
"description": "Get data for regions.",
},
{
"name": "property",
"description": "Get data for properties.",
},
]
app = FastAPI(openapi_tags=tags_metadata)
@app.get("/")
def read_root():
return {"message": "Hi there!"}
@app.get("/regions", response_model=Regions, tags=['region'])
def regions():
"""
Returns a list of all available regions.
"""
return {"regions" : d.properties_per_region().pl().to_dicts()}
@app.get("/regions/{id}/base", response_model=RegionBase, tags=['region'])
def region_base(id: int):
"""
Returns basic information about a region.
"""
base = d.region_base_data(id).pl().to_dicts()
return {"id": base[0]["id"], "name": base[0]["name"]}
@app.get("/regions/{id}/capacities", response_model=RegionCapacities, tags=['region'])
def region_capacities(id: int):
"""
Returns the capacities of a region, for every scraping. Set id to -1 to obtain data for all regions.
"""
capacities = etl_rc.region_capacities(id)
return capacities
@app.get("/regions/{id}/capacities/monthly/{date}", response_model=RegionCapacitiesMonthly, tags=['region'])
def region_capacities_monthly(id: int, date: datetime.date):
"""
Returns the capacities of a region for the specified date, by month. Set id to -1 to obtain data for all regions.
"""
capacities = etl_rcm.region_capacities_monthly(id, date)
return capacities
@app.get("/regions/{id}/capacities/daily/{date}", response_model=RegionCapacitiesDaily, tags=['region'])
def region_capacities_daily(id: int, date: datetime.date):
"""
Returns the capacities of a region for the specified date, by day. Set id to -1 to obtain data for all regions.
"""
capacities = etl_rcd.region_capacities_daily(id, date)
return capacities
@app.get("/regions/{id}/moving-average/{date}", response_model=RegionMovingAverage, tags=['region'])
def region_moving_average(id: int, date: datetime.date):
"""
Returns the moving average of a region for the specified date. Set id to -1 to obtain data for all regions.
"""
result = etl_rmA.region_movingAverage(id, date)
return result
@app.get("/regions/{id}/properties/capacities", response_model=RegionPropertiesCapacities, tags=['region'])
def region_property_capacities(id: int):
"""
Returns the capacities of properties in a region, for every scraping. Set id to -1 to obtain data for all regions.
"""
capacities = etl_rpc.region_properties_capacities(id)
return capacities
@app.get("/properties/growth", response_model=PropertiesGrowth, tags=['property'])
def properties_growth():
"""
Returns the growth rate of found properties.
"""
growth = d.properties_growth().pl()  # run the growth query once, not once per column
return {"dates": growth['date'].to_list(),
"total_all": growth['total_all'].to_list(),
"total_heidiland": growth['total_heidiland'].to_list(),
"total_engadin": growth['total_engadin'].to_list(),
"total_davos": growth['total_davos'].to_list(),
"total_stmoritz": growth['total_stmoritz'].to_list()}
@app.get("/properties/geo", response_model=PropertiesGeo, tags=['property'])
def properties_geo():
"""
Returns the geocoordinates of properties.
"""
return {"properties": d.properties_geo().pl().to_dicts()}
@app.get("/properties/{id}/base", response_model=PropertyBase, tags=['property'])
def property_base_data(id: int):
"""
Returns basic information about a property.
"""
base = d.property_base_data(id).pl().to_dicts()
return {
"property_platform_id": base[0]['property_platform_id'],
"first_found": str(base[0]['first_found']),
"last_found": str(base[0]['last_found']),
"latlng": base[0]['latlng'],
"region_id": base[0]['region_id'],
"region_name": base[0]['region_name']}
@app.get("/properties/{id}/neighbours", response_model=PropertyNeighbours, tags=['property'])
def property_neighbours(id: int):
"""
Returns the 10 nearest properties to the given property.
"""
return {"neighbours" : etl_pn.property_neighbours(id)}
@app.get("/properties/{id}/extractions", response_model=PropertyExtractions, tags=['property'])
def property_extractions(id: int):
"""
Returns extracted data for the given property.
"""
return {"extractions" : d.extractions_for(property_id = id).pl().cast({"date": pl.String}).to_dicts()}
@app.get("/properties/{id}/capacities", response_model=PropertyCapacities, tags=['property'])
def property_capacities_data(id: int):
"""
Returns capacities for the given property.
"""
capacities = etl_pc.property_capacities(id)
return capacities
@app.get("/properties/{id}/capacities/monthly/{date}", response_model=PropertyCapacitiesMonthly, tags=['property'])
def property_capacities_data_monthly(id: int, date: datetime.date):
"""
Returns capacities for the given property and date, by month.
"""
capacities = etl_pcm.property_capacities_monthly(id, date)
return capacities
@app.get("/properties/{id}/capacities/daily/{date}", response_model=PropertyCapacitiesDaily, tags=['property'])
def property_capacities_data_daily(id: int, date: datetime.date):
"""
Returns capacities for the given property and date, by day.
"""
capacities = etl_pcd.property_capacities_daily(id, date)
return capacities
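The `RegionMovingAverage` response model allows `None` entries in `capacities_moving_average`. A minimal sketch of a trailing moving average with such `None` padding (illustrative only; the actual `etl_region_movAverage` logic may window differently):

```python
from typing import List, Optional

def moving_average(values: List[float], window: int = 3) -> List[Optional[float]]:
    """Trailing moving average; positions without a full window stay None."""
    out: List[Optional[float]] = []
    for i in range(len(values)):
        if i + 1 < window:
            out.append(None)  # not enough history yet
        else:
            out.append(sum(values[i + 1 - window:i + 1]) / window)
    return out

print(moving_average([10.0, 20.0, 30.0, 40.0], window=2))  # [None, 15.0, 25.0, 35.0]
```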

etl/src/data/database.py Normal file

@ -0,0 +1,578 @@
from threading import Thread, current_thread
import duckdb
class Database:
def check_duckdb_extensions(self, extension):
return self.connection.execute("""
SELECT
installed
FROM
duckdb_extensions()
WHERE
extension_name = $extension
""",
{
"extension": extension
}
).fetchone()
def __init__(self, path):
duckdb_connection = duckdb.connect(database = path, read_only=True)
self.connection = duckdb_connection.cursor()
# Install spatial extension if not already installed
spatial_installed = self.check_duckdb_extensions(extension='spatial')
if spatial_installed and not spatial_installed[0]:
self.connection.sql("INSTALL spatial")
def db_overview(self):
return self.connection.sql("DESCRIBE;").show()
def seeds(self):
return self.connection.sql("""
SELECT
regions.name,
seeds.uri
FROM
consultancy_d.regions
LEFT JOIN
consultancy_d.seeds ON regions.id = seeds.region_id;
""").show()
def properties_growth(self):
return self.connection.sql("""
WITH PropertiesALL AS (
SELECT
strftime(created_at, '%Y-%m-%d') AS date,
COUNT(*) as properties_count,
SUM(properties_count) OVER (ORDER BY date) AS total
FROM
consultancy_d.properties p
GROUP BY
date
ORDER BY
date
),
PropertiesR1 AS (
SELECT
strftime(created_at, '%Y-%m-%d') AS date,
COUNT(*) as properties_count,
SUM(properties_count) OVER (ORDER BY date) AS total
FROM
consultancy_d.properties p
WHERE
p.seed_id = 1
GROUP BY
date
ORDER BY
date
),
PropertiesR2 AS (
SELECT
strftime(created_at, '%Y-%m-%d') AS date,
COUNT(*) as properties_count,
SUM(properties_count) OVER (ORDER BY date) AS total
FROM
consultancy_d.properties p
WHERE
p.seed_id = 2
GROUP BY
date
ORDER BY
date
),
PropertiesR3 AS (
SELECT
strftime(created_at, '%Y-%m-%d') AS date,
COUNT(*) as properties_count,
SUM(properties_count) OVER (ORDER BY date) AS total
FROM
consultancy_d.properties p
WHERE
p.seed_id = 3
GROUP BY
date
ORDER BY
date
),
PropertiesR4 AS (
SELECT
strftime(created_at, '%Y-%m-%d') AS date,
COUNT(*) as properties_count,
SUM(properties_count) OVER (ORDER BY date) AS total
FROM
consultancy_d.properties p
WHERE
p.seed_id = 4
GROUP BY
date
ORDER BY
date
)
SELECT
p.date,
p.total AS total_all,
pR1.total as total_heidiland,
pR2.total AS total_davos,
pR3.total AS total_engadin,
pR4.total AS total_stmoritz
FROM
PropertiesAll p
LEFT JOIN
PropertiesR1 pR1 ON p.date = pR1.date
LEFT JOIN
PropertiesR2 pR2 ON p.date = pR2.date
LEFT JOIN
PropertiesR3 pR3 ON p.date = pR3.date
LEFT JOIN
PropertiesR4 pR4 ON p.date = pR4.date
ORDER BY
p.date
""")
def properties_per_region(self):
return self.connection.sql("""
SELECT
regions.name,
regions.id,
COUNT(*) AS count_properties
FROM
consultancy_d.properties
LEFT JOIN
consultancy_d.seeds ON seeds.id = properties.seed_id
LEFT JOIN
consultancy_d.regions ON regions.id = seeds.region_id
GROUP BY
properties.seed_id,
regions.name,
regions.id
ORDER BY
count_properties ASC
""")
def propIds_with_region(self):
return self.connection.sql("""
SELECT
properties.id, seed_id, regions.name
FROM
consultancy_d.properties
LEFT JOIN
consultancy_d.seeds ON seeds.id = properties.seed_id
LEFT JOIN
consultancy_d.regions ON regions.id = seeds.region_id
""")
def properties_unreachable(self):
return self.connection.sql("""
SELECT
entity_id,
strftime(properties.created_at, '%Y-%m-%d') AS first_found,
strftime(properties.last_found, '%Y-%m-%d') AS last_found
FROM
consultancy_d.exceptions
LEFT JOIN
consultancy_d.properties ON properties.id = exceptions.entity_id
WHERE
JSON_VALID(exception) = true AND
JSON_EXTRACT(exception, '$.status') = '404'
GROUP BY ALL
ORDER BY
last_found
""").show()
def properties_not_found(self):
return self.connection.sql("""
SELECT
COUNT(entity_id) as count_props,
strftime(created_at, '%Y-%m-%d') as date
FROM
consultancy_d.exceptions
WHERE
JSON_VALID(exception) = true AND
JSON_EXTRACT(exception, '$.status') > 400
GROUP BY
date
""").show()
def properties_distance(self):
return self.connection.sql("""
LOAD spatial;
CREATE OR REPLACE VIEW geolocation_changes AS
SELECT
exceptions.entity_id,
properties.check_data AS geolocation_original,
SUBSTRING(exceptions.exception, 28) AS geolocation_new,
ST_Distance_Sphere(
ST_GeomFromText(
CONCAT(
'POINT(',
REPLACE(properties.check_data, ',', ' '),
')'
)
),
ST_GeomFromText(
CONCAT(
'POINT(',
REPLACE(SUBSTRING(exceptions.exception, 28), ',', ' '),
')'
)
)
) AS distance
FROM
consultancy_d.exceptions
LEFT JOIN
consultancy_d.properties ON exceptions.entity_id = properties.id
WHERE
exception LIKE 'geoLocation was different%'
GROUP BY
entity_id,
check_data,
geolocation_new
ORDER BY
distance;
SELECT * FROM geolocation_changes;
SELECT
'0 bis 25' AS category,
COUNT(*) as count_properties
FROM
geolocation_changes
WHERE
distance >= (0)
AND distance < (25)
UNION
SELECT
'25 bis 50' AS category,
COUNT(*) as count_properties
FROM
geolocation_changes
WHERE
distance >= (25)
AND distance < (50)
UNION
SELECT
'50 bis 75' AS category,
COUNT(*) as count_properties
FROM
geolocation_changes
WHERE
distance >= (50)
AND distance < (75)
UNION
SELECT
'75 bis 100' AS category,
COUNT(*) as count_properties
FROM
geolocation_changes
WHERE
distance >= (75)
AND distance < (100);
""")
def properties_exceptions(self):
return self.connection.sql("""
SELECT
JSON_EXTRACT(exception, '$.status') AS exception_status,
COUNT(JSON_EXTRACT(exception, '$.status')) AS exception_count
FROM
consultancy_d.exceptions
WHERE
type != 'property'
GROUP BY
JSON_EXTRACT(exception, '$.status')
""")
def extractions(self):
return self.connection.sql("""
SELECT
JSON_EXTRACT(body, '$.content.days') as calendar,
property_id,
created_at
FROM
consultancy_d.extractions
WHERE
type == 'calendar' AND
calendar NOT NULL
ORDER BY
property_id
""")
def extractions_with_region(self):
return self.connection.sql("""
SELECT
JSON_EXTRACT(body, '$.content.days') as calendar,
extractions.property_id,
extractions.created_at,
properties.seed_id,
regions.name
FROM
consultancy_d.extractions
LEFT JOIN
consultancy_d.properties ON properties.id = extractions.property_id
LEFT JOIN
consultancy_d.seeds ON seeds.id = properties.seed_id
LEFT JOIN
consultancy_d.regions ON regions.id = seeds.region_id
WHERE
calendar NOT NULL
""")
def extractions_for(self, property_id):
return self.connection.sql(f"""
SELECT
JSON_EXTRACT(body, '$.content.days') as calendar,
created_at as date
FROM
consultancy_d.extractions
WHERE
type == 'calendar' AND
property_id = {property_id} AND
calendar NOT NULL
ORDER BY
created_at
""")
def extractions_propId_scrapeDate(self, property_id: int, scrape_date: str):
return self.connection.sql(f"""
SELECT
JSON_EXTRACT(body, '$.content.days') as calendar,
created_at
FROM
consultancy_d.extractions
WHERE
type == 'calendar' AND
property_id = {property_id} AND
calendar NOT NULL AND
created_at >= '{scrape_date}'
ORDER BY
created_at
LIMIT 1
""")
# Number of extracted properties per extraction run
def properties_per_extraction(self, property_id):
return self.connection.sql("""
SELECT
COUNT(property_id),
strftime(created_at, '%Y-%m-%d') AS date
FROM
consultancy_d.extractions
WHERE
type == 'calendar'
GROUP BY
date
ORDER BY date ASC
""")
def price(self):
return self.connection.sql("""
SELECT
JSON_EXTRACT(body, '$.content.lowestPrice.valueWeekRaw') AS pricePerWeek,
JSON_EXTRACT(body, '$.content.lowestPrice.valueNightRaw') AS pricePerNight,
JSON_EXTRACT(body, '$.content.lowestPrice.currency') AS currency,
property_id,
created_at
FROM
consultancy_d.extractions
WHERE
type == 'price'
ORDER BY property_id
""")
def price_developement_per_property(self):
return self.connection.sql("""
SELECT
JSON_EXTRACT(body, '$.content.lowestPrice.valueNightRaw') AS pricePerNight,
property_id,
created_at
FROM
consultancy_d.extractions
WHERE
type == 'price'
ORDER BY property_id
""")
def property_base_data(self, id):
return self.connection.sql(f"""
SELECT
p.property_platform_id,
p.created_at as first_found,
p.last_found,
p.check_data as latlng,
r.id as region_id,
r.name as region_name
FROM
consultancy_d.properties p
INNER JOIN consultancy_d.seeds s ON s.id = p.seed_id
INNER JOIN consultancy_d.regions r ON s.region_id = r.id
WHERE
p.id = {id}
""")
def region_base_data(self, id):
if id == -1:
where = ''
else:
where = f"WHERE r.id = {id}"
return self.connection.sql(f"""
SELECT
r.id as id,
r.name as name
FROM
consultancy_d.regions r
{where}
""")
def properties_geo(self):
return self.connection.sql("""
SELECT
p.id as property_id,
p.check_data as latlng,
r.id as region_id
FROM
consultancy_d.properties p
LEFT JOIN
consultancy_d.seeds s ON s.id = p.seed_id
LEFT JOIN
consultancy_d.regions r ON r.id = s.region_id
""")
def properties_geo_seeds(self):
return self.connection.sql("""
SELECT
p.id,
p.seed_id,
p.check_data as coordinates
FROM
consultancy_d.properties p
""")
def capacity_of_region(self, region_id):
return self.connection.sql(f"""
SELECT
JSON_EXTRACT(body, '$.content.days') as calendarBody,
strftime(extractions.created_at, '%Y-%m-%d') AS ScrapeDate,
extractions.property_id,
FROM
consultancy_d.extractions
LEFT JOIN
consultancy_d.properties ON properties.id = extractions.property_id
WHERE
type == 'calendar' AND
properties.seed_id = {region_id} AND
calendarBody NOT NULL
""")
def singleScrape_of_region(self, region_id: int, scrape_date_min: str, scrape_date_max: str):
return self.connection.sql(f"""
SELECT
JSON_EXTRACT(body, '$.content.days') as calendarBody,
FROM
consultancy_d.extractions
LEFT JOIN
consultancy_d.properties ON properties.id = extractions.property_id
WHERE
type == 'calendar' AND
properties.seed_id = {region_id} AND
extractions.created_at >= '{scrape_date_min}' AND
extractions.created_at < '{scrape_date_max}' AND
calendarBody NOT NULL
""")
def singleScrape_of_global(self, scrape_date_min: str, scrape_date_max: str):
return self.connection.sql(f"""
SELECT
JSON_EXTRACT(body, '$.content.days') as calendarBody,
FROM
consultancy_d.extractions
LEFT JOIN
consultancy_d.properties ON properties.id = extractions.property_id
WHERE
type == 'calendar' AND
extractions.created_at >= '{scrape_date_min}' AND
extractions.created_at < '{scrape_date_max}' AND
calendarBody NOT NULL
""")
def singleScrape_of_region_scrapDate(self, region_id: int, scrape_date_min: str, scrape_date_max: str):
return self.connection.sql(f"""
SELECT
JSON_EXTRACT(body, '$.content.days') as calendarBody,
extractions.created_at
FROM
consultancy_d.extractions
LEFT JOIN
consultancy_d.properties ON properties.id = extractions.property_id
WHERE
type == 'calendar' AND
properties.seed_id = {region_id} AND
extractions.created_at >= '{scrape_date_min}' AND
extractions.created_at < '{scrape_date_max}' AND
calendarBody NOT NULL
""")
def singleScrape_of_global_scrapDate(self, scrape_date_min: str, scrape_date_max: str):
return self.connection.sql(f"""
SELECT
JSON_EXTRACT(body, '$.content.days') as calendarBody,
extractions.created_at
FROM
consultancy_d.extractions
LEFT JOIN
consultancy_d.properties ON properties.id = extractions.property_id
WHERE
type == 'calendar' AND
extractions.created_at >= '{scrape_date_min}' AND
extractions.created_at < '{scrape_date_max}' AND
calendarBody NOT NULL
""")
def capacity_global(self):
return self.connection.sql(f"""
SELECT
JSON_EXTRACT(body, '$.content.days') as calendarBody,
strftime(extractions.created_at, '%Y-%m-%d') AS ScrapeDate,
extractions.property_id,
FROM
consultancy_d.extractions
LEFT JOIN
consultancy_d.properties ON properties.id = extractions.property_id
WHERE
type == 'calendar'
AND
calendarBody NOT NULL
""")
def capacity_comparison_of_region(self, region_id_1, region_id_2):
return self.connection.sql(f"""
SELECT
JSON_EXTRACT(body, '$.content.days') as calendarBody,
strftime(extractions.created_at, '%Y-%m-%d') AS ScrapeDate,
extractions.property_id,
properties.seed_id
FROM
consultancy_d.extractions
LEFT JOIN
consultancy_d.properties ON properties.id = extractions.property_id
WHERE
type == 'calendar' AND
(properties.seed_id = {region_id_1} OR
properties.seed_id = {region_id_2}) AND
calendarBody NOT NULL
""")
def unique_scrapeDates(self):
return self.connection.sql(f"""
SELECT DISTINCT
strftime(extractions.created_at, '%Y-%m-%d') AS ScrapeDate,
FROM
consultancy_d.extractions
""")

etl/src/data/etl_cache.py Normal file
@@ -0,0 +1,18 @@
from pathlib import Path
from pickle import dump, load
Path('cache').mkdir(parents=True, exist_ok=True)
# load pickle obj
def openObj(file):
filepath = Path(f"cache/{file}")
if filepath.is_file():
with open(filepath, 'rb') as f:
return load(f)
return False
# save pickle obj
def saveObj(file, result):
filepath = Path(f"cache/{file}")
with open(filepath, 'wb') as f:
dump(result, f)
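Every ETL function below follows the same openObj/saveObj round trip: try the cache, compute on a miss, store the result. A self-contained sketch of that pattern (the helpers are reproduced from the module above; the filename and payload are hypothetical):

```python
from pathlib import Path
from pickle import dump, load

Path('cache').mkdir(parents=True, exist_ok=True)

def openObj(file):
    # Returns the cached object, or False on a cache miss
    filepath = Path(f"cache/{file}")
    if filepath.is_file():
        with open(filepath, 'rb') as f:
            return load(f)
    return False

def saveObj(file, result):
    filepath = Path(f"cache/{file}")
    with open(filepath, 'wb') as f:
        dump(result, f)

# Cache miss -> compute -> store -> subsequent calls hit the cache
payload = {"capacities": [42.0], "dates": ["2024-01-01"]}  # hypothetical payload
if not openObj("demo.obj"):
    saveObj("demo.obj", payload)
assert openObj("demo.obj") == payload
```

Note that a miss returns `False` rather than `None`, so callers must truth-test the result before using it.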

@@ -0,0 +1,46 @@
from io import StringIO
import polars as pl
import data
from data import etl_cache
d = data.load()
def property_capacities(id: int):
file = f"etl_property_capacities_{id}.obj"
obj = etl_cache.openObj(file)
if obj:
return obj
extractions = d.extractions_for(id).pl()
df_dates = pl.DataFrame()
for row in extractions.rows(named=True):
df_calendar = pl.read_json(StringIO(row['calendar']))
df_dates = pl.concat([df_calendar, df_dates], how="diagonal")
sum_hor = df_dates.sum_horizontal()
# Get the available dates per extraction
count_days = []
for dates in df_dates.rows():
# Remove all None values
liste = [x for x in dates if x is not None]
count_days.append(len(liste))
counts = pl.DataFrame({"count_days" : count_days, "sum" : sum_hor})
result = {"capacities": [], "dates": extractions['date'].cast(pl.Date).cast(pl.String).to_list() }
for row in counts.rows(named=True):
max_capacity = row['count_days'] * 2
max_capacity_perc = 100 / max_capacity
result['capacities'].append(round(max_capacity_perc * row['sum'], 2))
result['capacities'].reverse()
etl_cache.saveObj(file, result)
return result
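The per-extraction capacity above reduces to `sum / (2 * count_days) * 100`; the 0/1/2 day-score semantics are an assumption inferred from the factor of 2 in `max_capacity`. A tiny worked example with made-up calendar values:

```python
# Each calendar day is scored 0, 1 or 2 (assumed: 2 = fully available),
# so an extraction with N known days has a maximum score of 2 * N.
days = [2, 2, 1, 0, None, 2]          # made-up calendar values; None = unknown
known = [v for v in days if v is not None]
capacity = round(100 / (2 * len(known)) * sum(known), 2)
assert capacity == 70.0               # 7 of 10 possible points
```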

@@ -0,0 +1,41 @@
from io import StringIO
import polars as pl
import data
from data import etl_cache
d = data.load()
def property_capacities_daily(id: int, scrapeDate: str):
file = f"etl_property_capacities_weekdays_{id}_{scrapeDate}.obj"
obj = etl_cache.openObj(file)
if obj:
return obj
extractions = d.extractions_propId_scrapeDate(id, scrapeDate).pl()
weekdays = ['Monday', 'Tuesday', 'Wednesday', 'Thursday', 'Friday', 'Saturday', 'Sunday']
df_calendar = pl.DataFrame()
numWeeks = 0
for row in extractions.rows(named=True):
scrapeDate = row['created_at']
df_calendar = pl.read_json(StringIO(row['calendar']))
columnTitles = df_calendar.columns
df_calendar = df_calendar.transpose()
df_calendar = df_calendar.with_columns(pl.Series(name="dates", values=columnTitles))
df_calendar = df_calendar.with_columns((pl.col("dates").str.to_date()))
numWeeks = round((df_calendar.get_column("dates").max() - df_calendar.get_column("dates").min()).days / 7, 0)
df_calendar = df_calendar.with_columns(pl.col("dates").dt.weekday().alias("weekday_num"))
df_calendar = df_calendar.with_columns(pl.col("dates").dt.strftime("%A").alias("weekday"))
df_calendar = df_calendar.drop("dates")
df_calendar = df_calendar.group_by(["weekday", "weekday_num"]).agg(pl.col("column_0").sum())
df_calendar = df_calendar.with_columns((pl.col("column_0") / numWeeks / 2 * 100).alias("column_0"))
df_calendar = df_calendar.sort('weekday_num')
df_calendar = df_calendar.drop('weekday_num')
result = {"date": scrapeDate, "weekdays": df_calendar['weekday'].to_list(), 'capacities': df_calendar['column_0'].to_list()}
etl_cache.saveObj(file, result)
return result

@@ -0,0 +1,38 @@
from io import StringIO
import polars as pl
import data
from data import etl_cache
d = data.load()
def property_capacities_monthly(id: int, scrapeDate: str):
file = f"etl_property_capacities_monthly_{id}_{scrapeDate}.obj"
obj = etl_cache.openObj(file)
if obj:
return obj
extractions = d.extractions_propId_scrapeDate(id, scrapeDate).pl()
df_calendar = pl.DataFrame()
for row in extractions.rows(named=True):
scrapeDate = row['created_at']
df_calendar = pl.read_json(StringIO(row['calendar']))
columnTitles = df_calendar.columns
df_calendar = df_calendar.transpose()
df_calendar = df_calendar.with_columns(pl.Series(name="dates", values=columnTitles))
df_calendar = df_calendar.with_columns((pl.col("dates").str.to_date()))
df_calendar = df_calendar.with_columns((pl.col("dates").dt.month_end().dt.day().alias('numDays')))
df_calendar = df_calendar.with_columns((pl.col("dates").dt.strftime("%b") + " " + (pl.col("dates").dt.strftime("%Y"))).alias('date_short'))
df_calendar = df_calendar.with_columns((pl.col("dates").dt.strftime("%Y") + " " + (pl.col("dates").dt.strftime("%m"))).alias('dates'))
df_calendar = df_calendar.group_by(['dates', 'date_short', 'numDays']).agg(pl.col("column_0").sum())
df_calendar = df_calendar.with_columns((pl.col("column_0") / pl.col("numDays") / 2 * 100).alias("column_0"))
df_calendar = df_calendar.sort('dates')
result = {"months": df_calendar['date_short'].to_list(), 'capacities': df_calendar['column_0'].to_list()}
etl_cache.saveObj(file, result)
return result
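The `numDays` column above is simply the length of each date's month, used to norm the monthly sum. The same normalisation in plain Python (the daily scores here are made up; the 0-2 score range is an assumption inferred from the divisor of 2):

```python
from calendar import monthrange
from datetime import date

# Made-up daily scores (0-2) for two days in January 2024
scores = {date(2024, 1, 1): 2, date(2024, 1, 2): 1}
month_total = sum(scores.values())
num_days = monthrange(2024, 1)[1]          # 31 days in January
capacity = month_total / num_days / 2 * 100
assert round(capacity, 2) == 4.84
```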

@@ -0,0 +1,73 @@
from math import atan2, cos, radians, sin, sqrt
import polars as pl
import data
from data import etl_cache
d = data.load()
def calcHaversinDistance(latMain, lonMain, lat, lon):
R = 6371
# convert decimal degrees to radians
latMain, lonMain, lat, lon = map(radians, [latMain, lonMain, lat, lon])
# haversine formula
dlon = lonMain - lon
dlat = latMain - lat
a = sin(dlat / 2) ** 2 + cos(lat) * cos(latMain) * sin(dlon / 2) ** 2
c = 2 * atan2(sqrt(a), sqrt(1-a))
d = R * c
return d
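A quick sanity check of the haversine implementation: one degree of longitude along the equator should come out near 2πR/360 ≈ 111.19 km for R = 6371 km. The function is reproduced here so the check is self-contained:

```python
from math import atan2, cos, radians, sin, sqrt

def calcHaversinDistance(latMain, lonMain, lat, lon):
    # Great-circle distance in km on a sphere of radius 6371 km
    R = 6371
    latMain, lonMain, lat, lon = map(radians, [latMain, lonMain, lat, lon])
    dlon = lonMain - lon
    dlat = latMain - lat
    a = sin(dlat / 2) ** 2 + cos(lat) * cos(latMain) * sin(dlon / 2) ** 2
    return R * 2 * atan2(sqrt(a), sqrt(1 - a))

# One degree of longitude on the equator: 2 * pi * 6371 / 360 ≈ 111.19 km
d = calcHaversinDistance(0.0, 0.0, 0.0, 1.0)
assert abs(d - 111.19) < 0.01
```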
def property_neighbours(id: int):
file = f"etl_property_neighbours_{id}.obj"
obj = etl_cache.openObj(file)
if obj:
return obj
extractions = d.properties_geo_seeds().pl()
# Get lat, long and region from main property
latMain, lonMain = extractions.filter(pl.col('id') == str(id))['coordinates'][0].split(',')
latMain, lonMain = map(float, [latMain, lonMain])
region = extractions.filter(pl.col('id') == str(id))['seed_id'][0]
# Prefilter the dataframe to only the correct region
extractions = extractions.filter(pl.col('seed_id') == str(region))
extractions = extractions.drop('seed_id')
# Remove main property from DF
extractions = extractions.filter(pl.col('id') != str(id))
# Split coordinate into lat and lon
extractions = extractions.with_columns(pl.col("coordinates").str.split_exact(",", 1).struct.rename_fields(["lat", "lon"]).alias("lat/lon")).unnest("lat/lon")
extractions = extractions.drop('coordinates')
extractions = extractions.with_columns(pl.col("lat").cast(pl.Float32))
extractions = extractions.with_columns(pl.col("lon").cast(pl.Float32))
# Calculate distances
distances = []
for row in extractions.rows(named=True):
lat = row['lat']
lon = row['lon']
dist = calcHaversinDistance(latMain, lonMain, lat, lon)
distances.append(dist)
# Add distance to DF
extractions = extractions.with_columns(pl.Series(name="distances", values=distances))
# Sort for distance and give only first 10
extractions = extractions.sort("distances").head(10)
extractions = extractions.drop('distances')
result = extractions.to_dicts()
etl_cache.saveObj(file, result)
return result

@@ -0,0 +1,58 @@
from datetime import date
from io import StringIO
import polars as pl
import data
from data import etl_cache
d = data.load()
def region_capacities(id: int):
file = f"etl_region_capacities_{id}.obj"
obj = etl_cache.openObj(file)
if obj:
return obj
# Get Data
if id == -1:
extractions = d.capacity_global().pl()
else:
extractions = d.capacity_of_region(id).pl()
# turn PropertyIDs to ints for sorting
extractions = extractions.cast({"property_id": int})
extractions = extractions.drop('property_id')
# Get Data from JSON
gridData = pl.DataFrame(schema=[("scrape_date", pl.String), ("sum_hor", pl.Int64), ("calendar_width", pl.Int64)])
dayCounts = []
for row in extractions.rows(named=True):
# Fall back to 0 for both sum and width if the calendar is null;
# reading calDF.width in that case would crash when the first row has no calendar
if row['calendarBody']:
calDF = pl.read_json(StringIO(row['calendarBody']))
sum_hor = calDF.sum_horizontal()[0]
calendar_width = calDF.width
else:
sum_hor = 0
calendar_width = 0
gridData = gridData.vstack(pl.DataFrame({"scrape_date" : row['ScrapeDate'], "sum_hor": sum_hor, "calendar_width": calendar_width}))
# Create Aggregates of values
df_count = gridData.group_by("scrape_date").agg(pl.col("sum_hor").count())
df_sum = gridData.group_by("scrape_date").agg(pl.col("sum_hor").sum())
df_numDays = gridData.group_by("scrape_date").agg(pl.col("calendar_width").max())
# Join and rename DF's
df = df_sum.join(df_count, on= 'scrape_date').join(df_numDays, on= 'scrape_date')
# Calculate normed capacities for each scrapeDate
df = df.with_columns((pl.col("sum_hor") / pl.col("sum_hor_right") / (pl.col("calendar_width")*2) * 100).alias("capacity"))
# Sort the date column
df = df.cast({"scrape_date": date}).sort('scrape_date')
result = {"capacities": df['capacity'].to_list(), "dates": df['scrape_date'].to_list()}
etl_cache.saveObj(file, result)
return result

@@ -0,0 +1,64 @@
from datetime import datetime, timedelta
from io import StringIO
import polars as pl
import data
from data import etl_cache
d = data.load()
def region_capacities_daily(id: int, scrapeDate_start: str):
file = f"etl_region_capacities_weekdays_{id}_{scrapeDate_start}.obj"
obj = etl_cache.openObj(file)
if obj:
return obj
# Get end date of start search-window
scrapeDate_end = scrapeDate_start + timedelta(days=1)
# Get Data
if id == -1:
extractions = d.singleScrape_of_global_scrapDate(scrapeDate_start, scrapeDate_end).pl()
else:
extractions = d.singleScrape_of_region_scrapDate(id, scrapeDate_start, scrapeDate_end).pl()
df_calendar = pl.DataFrame()
numWeeks = 0
firstExe = True
counter = 0
for row in extractions.rows(named=True):
scrapeDate = row['created_at']
if row['calendarBody']:
counter += 1
df_calendar = pl.read_json(StringIO(row['calendarBody']))
columnTitles = df_calendar.columns
df_calendar = df_calendar.transpose()
df_calendar = df_calendar.with_columns(pl.Series(name="dates", values=columnTitles))
df_calendar = df_calendar.with_columns((pl.col("dates").str.to_date()))
numWeeks = round((df_calendar.get_column("dates").max() - df_calendar.get_column("dates").min()).days / 7, 0)
df_calendar = df_calendar.with_columns(pl.col("dates").dt.weekday().alias("weekday_num"))
df_calendar = df_calendar.with_columns(pl.col("dates").dt.strftime("%A").alias("weekday"))
df_calendar = df_calendar.drop("dates")
df_calendar = df_calendar.group_by(["weekday", "weekday_num"]).agg(pl.col("column_0").sum())
df_calendar = df_calendar.with_columns((pl.col("column_0") / numWeeks / 2 * 100).alias("column_0"))
df_calendar = df_calendar.sort('weekday_num')
df_calendar = df_calendar.drop('weekday_num')
df_calendar = df_calendar.rename({'column_0': str(counter)})
if firstExe:
outDf = df_calendar
firstExe = False
else:
outDf = outDf.join(df_calendar, on='weekday')
# Calculate horizontal Mean
means = outDf.mean_horizontal()
outDf = outDf.insert_column(1, means)
outDf = outDf[['weekday', 'mean']]
result = {"weekdays": outDf['weekday'].to_list(),'capacities': outDf['mean'].to_list()}
etl_cache.saveObj(file, result)
return result

@@ -0,0 +1,65 @@
from datetime import datetime, timedelta
from io import StringIO
import polars as pl
import data
from data import etl_cache
d = data.load()
def region_capacities_monthly(id: int, scrapeDate_start: str):
file = f"etl_region_capacities_monthly_{id}_{scrapeDate_start}.obj"
obj = etl_cache.openObj(file)
if obj:
return obj
# Get end date of start search-window
scrapeDate_end = scrapeDate_start + timedelta(days=1)
# Get Data
if id == -1:
extractions = d.singleScrape_of_global_scrapDate(scrapeDate_start, scrapeDate_end).pl()
else:
extractions = d.singleScrape_of_region_scrapDate(id, scrapeDate_start, scrapeDate_end).pl()
df_calendar = pl.DataFrame()
numWeeks = 0
firstExe = True
counter = 0
for row in extractions.rows(named=True):
scrapeDate = row['created_at']
if row['calendarBody']:
counter += 1
df_calendar = pl.read_json(StringIO(row['calendarBody']))
columnTitles = df_calendar.columns
df_calendar = df_calendar.transpose()
df_calendar = df_calendar.with_columns(pl.Series(name="dates", values=columnTitles))
df_calendar = df_calendar.with_columns((pl.col("dates").str.to_date()))
df_calendar = df_calendar.with_columns((pl.col("dates").dt.month_end().dt.day().alias('numDays')))
df_calendar = df_calendar.with_columns((pl.col("dates").dt.strftime("%b") + " " + (pl.col("dates").dt.strftime("%Y"))).alias('date_short'))
df_calendar = df_calendar.with_columns((pl.col("dates").dt.strftime("%Y") + " " + (pl.col("dates").dt.strftime("%m"))).alias('dates'))
df_calendar = df_calendar.group_by(['dates', 'date_short','numDays']).agg(pl.col("column_0").sum())
df_calendar = df_calendar.with_columns((pl.col("column_0") / pl.col("numDays") / 2 * 100).alias("column_0"))
df_calendar = df_calendar.sort('dates')
df_calendar = df_calendar.drop('dates')
df_calendar = df_calendar.drop('numDays')
df_calendar = df_calendar.rename({'column_0': str(counter)})
if firstExe:
outDf = df_calendar
firstExe = False
else:
outDf = outDf.join(df_calendar, on='date_short')
# Calculate horizontal Mean
means = outDf.mean_horizontal()
outDf = outDf.insert_column(1, means)
outDf = outDf[['date_short', 'mean']]
result = {"date": scrapeDate, "months": outDf['date_short'].to_list(),'capacities': outDf['mean'].to_list()}
etl_cache.saveObj(file, result)
return result

@@ -0,0 +1,136 @@
from datetime import date, datetime, timedelta
from io import StringIO
import polars as pl
import data
from data import etl_cache
d = data.load()
def region_movingAverage(id: int, scrape_date_start_min: date):
file = f"etl_region_movingAverage_{id}_{scrape_date_start_min}.obj"
obj = etl_cache.openObj(file)
if obj:
return obj
# Settings
# Offset between actual and predict ScrapeDate
timeOffset = 30
# Calculation Frame
calcFrame = 180
# Filter Setting
windowSize = 7
# Get unique ScrapeDates
uniqueScrapeDates = d.unique_scrapeDates().pl()
uniqueScrapeDates = uniqueScrapeDates.get_column('ScrapeDate').str.to_date()
uniqueScrapeDates = uniqueScrapeDates.sort().to_list()
# Get end date of start search-window
scrape_date_start_max = scrape_date_start_min + timedelta(days=1)
# Get start and end date of End search-window
scrape_date_end_min = scrape_date_start_min + timedelta(days=timeOffset)
# Get closest ScrapeDate
scrape_date_end_min = min(uniqueScrapeDates, key=lambda x: abs(x - scrape_date_end_min))
scrape_date_end_max = scrape_date_end_min + timedelta(days=1)
final_end_date = scrape_date_end_min + timedelta(days=calcFrame)
# Get Data
if id == -1:
ex_start = d.singleScrape_of_global(scrape_date_start_min, scrape_date_start_max)
ex_start_count = ex_start.shape[0]
ex_end = d.singleScrape_of_global(scrape_date_end_min, scrape_date_end_max)
ex_end_count = ex_end.shape[0]
else:
ex_start = d.singleScrape_of_region(id, scrape_date_start_min, scrape_date_start_max)
ex_start_count = ex_start.shape[0]
ex_end = d.singleScrape_of_region(id, scrape_date_end_min, scrape_date_end_max)
ex_end_count = ex_end.shape[0]
num_properties = [ex_start_count, ex_end_count]
start_end = [ex_start, ex_end]
outDFList = []
for df in start_end:
df = df.pl()
firstExe = True
counter = 1
outDF = pl.DataFrame(schema={"0": int, "dates": date})
for row in df.rows(named=True):
if row['calendarBody']:
calDF = pl.read_json(StringIO(row['calendarBody']))
columnTitles = calDF.columns
calDF = calDF.transpose()
calDF = calDF.with_columns(pl.Series(name="dates", values=columnTitles))
calDF = calDF.with_columns((pl.col("dates").str.to_date()))
# Filter out all Data that's in the calculation frame
calDF = calDF.filter((pl.col("dates") >= (scrape_date_start_min + timedelta(days=1))))
calDF = calDF.filter((pl.col("dates") < final_end_date))
# Join all information into one Dataframe
if firstExe:
outDF = calDF
firstExe = False
else:
outDF = outDF.join(calDF, on='dates')
outDF = outDF.rename({'column_0': str(counter)})
counter += 1
outDF = outDF.sort('dates')
outDFList.append(outDF)
# Calculate the horizontal sum for all dates
array_counter = 0
tempDFList = []
for df in outDFList:
dates = df.select(pl.col("dates"))
values = df.select(pl.exclude("dates"))
sum_hor = values.sum_horizontal()
sum_hor = sum_hor / num_properties[array_counter] / 2 * 100
array_counter += 1
newDF = dates.with_columns(sum_hor=pl.Series(sum_hor))
tempDFList.append(newDF)
# Join actual and predict Values
outDF = tempDFList[0].join(tempDFList[1], on='dates', how='outer')
# Rename Columns for clarity
outDF = outDF.drop('dates_right')
# sum_hor_predict is the data from the earlier ScrapeDate
outDF = outDF.rename({'sum_hor_right': 'sum_hor_actual', 'sum_hor': 'sum_hor_predict'})
# Calculate Moving average from Start
baseValues = outDF.get_column('sum_hor_predict').to_list()
i = 0
moving_averages = []
while i < len(baseValues) - windowSize + 1:
window = baseValues[i: i + windowSize]
window_average = sum(window) / windowSize
moving_averages.append(window_average)
i += 1
# Add empty values back to the front and end of moving_averages
num_empty = int(windowSize / 2)
moving_averages = [None] *num_empty + moving_averages + [None] * num_empty
# Add moving_averages to df
outDF = outDF.with_columns(moving_averages=pl.Series(moving_averages))
result = {'dates': outDF.get_column('dates').to_list(), 'capacities_timeframe_before': outDF.get_column('sum_hor_predict').to_list(), 'capacities_timeframe_after':outDF.get_column('sum_hor_actual').to_list(), 'capacities_moving_average':outDF.get_column('moving_averages').to_list(),}
etl_cache.saveObj(file, result)
return result
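The sliding-window block above computes a centered moving average and pads both ends with None so the smoothed series keeps the original length. The same logic in isolation (assuming an odd window size, as with the windowSize of 7 used above):

```python
def centered_moving_average(values, window_size=7):
    # Plain sliding-window mean; assumes an odd window_size so that
    # window_size // 2 pad entries on each side restore the original length
    averages = [
        sum(values[i:i + window_size]) / window_size
        for i in range(len(values) - window_size + 1)
    ]
    pad = [None] * (window_size // 2)
    return pad + averages + pad

smoothed = centered_moving_average(list(range(10)), window_size=3)
assert len(smoothed) == 10
assert smoothed[0] is None and smoothed[-1] is None
assert smoothed[1] == 1.0  # mean of 0, 1, 2
```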

@@ -0,0 +1,64 @@
from io import StringIO
import polars as pl
import data
from data import etl_cache
d = data.load()
def region_properties_capacities(id: int):
file = f"etl_region_properties_capacities_{id}.obj"
obj = etl_cache.openObj(file)
if obj:
return obj
# Get Data
if id == -1:
df = d.capacity_global().pl()
else:
df = d.capacity_of_region(id).pl()
# turn PropertyIDs to ints for sorting
df = df.cast({"property_id": int})
# Get uniques for dates and propIDs and sort them
listOfDates = df.get_column("ScrapeDate").unique().sort()
listOfPropertyIDs = df.get_column("property_id").unique().sort()
# Create DFs from lists to merge later
datesDF = pl.DataFrame(listOfDates).with_row_index("date_index")
propIdDF = pl.DataFrame(listOfPropertyIDs).with_row_index("prop_index")
# Merge Dataframe to generate indices
df = df.join(datesDF, on='ScrapeDate')
df = df.join(propIdDF, on='property_id')
# Calculate grid values
gridData = pl.DataFrame(schema=[("scrape_date", pl.String), ("property_id", pl.String), ("sum_hor", pl.Int64)])
for row in df.rows(named=True):
# Return 0 for sum if calendar is null
if row['calendarBody']:
calDF = pl.read_json(StringIO(row['calendarBody']))
sum_hor = calDF.sum_horizontal()[0]
else:
sum_hor = 0
gridData = gridData.vstack(pl.DataFrame({"scrape_date" : row['ScrapeDate'], "property_id": str(row['property_id']), "sum_hor": sum_hor}))
# get the overall maximum sum
maxValue = gridData['sum_hor'].max()
values = []
for row in gridData.rows(named=True):
capacity = (row['sum_hor']*100)/maxValue
values.append({"date" : row['scrape_date'], "property_id": row['property_id'], "capacity": capacity})
# Cast listOfDates to datetime
listOfDates = listOfDates.cast(pl.Date).to_list()
listOfPropertyIDs = listOfPropertyIDs.cast(pl.String).to_list()
# Create JSON
outDict = {'dates': listOfDates, 'property_ids': listOfPropertyIDs, 'values': values}
etl_cache.saveObj(file, outDict)
return outDict

etl/src/etl/__init__.py Normal file
@@ -1,22 +0,0 @@
from typing import Union
import polars as pl
from fastapi import FastAPI, Response
import data
d = data.load()
app = FastAPI()
@app.get("/")
def read_root():
return {"Hello": "World"}
@app.get("/items/{item_id}")
def read_item(item_id: int):
ext = d.extractions_for(item_id).pl()
out = ext.with_columns(pl.col("calendar").str.extract_all(r"([0-9]{4}-[0-9]{2}-[0-9]{2})|[0-2]").alias("calendar_data"))
out = out.drop(['calendar', 'property_id'])
return Response(content=out.write_json(), media_type="application/json")

@@ -1,269 +0,0 @@
from threading import Thread, current_thread
import duckdb
class Database:
def check_duckdb_extensions(self, extension):
return self.connection.execute("""
SELECT
installed
FROM
duckdb_extensions()
WHERE
extension_name = $extension
""",
{
"extension": extension
}
).fetchone()
def __init__(self, path):
duckdb_connection = duckdb.connect(database = path, read_only=True)
self.connection = duckdb_connection.cursor()
# Install spatial extension if not already installed
spatial_installed = self.check_duckdb_extensions(extension='spatial')
if(spatial_installed and not spatial_installed[0]):
self.connection.sql("INSTALL spatial")
def db_overview(self):
return self.connection.sql("DESCRIBE;").show()
def seeds(self):
return self.connection.sql("""
SELECT
regions.name,
seeds.uri
FROM
consultancy_d.regions
LEFT JOIN
consultancy_d.seeds ON regions.id = seeds.region_id;
""").show()
def properties_growth(self):
return self.connection.sql("""
SELECT
strftime(created_at, '%Y-%m-%d') AS date,
COUNT(*) as properties_count
FROM
consultancy_d.properties
GROUP BY
date;
""")
def properties_per_region(self):
return self.connection.sql("""
SELECT
regions.name,
COUNT(*) AS count_properties
FROM
consultancy_d.properties
LEFT JOIN
consultancy_d.seeds ON seeds.id = properties.seed_id
LEFT JOIN
consultancy_d.regions ON regions.id = seeds.region_id
GROUP BY
properties.seed_id,
regions.name
""")
def properties_unreachable(self):
return self.connection.sql("""
SELECT
entity_id,
strftime(properties.created_at, '%Y-%m-%d') AS first_found,
strftime(properties.last_found, '%Y-%m-%d') AS last_found
FROM
consultancy_d.exceptions
LEFT JOIN
consultancy_d.properties ON properties.id = exceptions.entity_id
WHERE
JSON_VALID(exception) = true AND
JSON_EXTRACT(exception, '$.status') = '404'
GROUP BY ALL
ORDER BY
last_found
""").show()
def properties_not_found(self):
return self.connection.sql("""
SELECT
COUNT(entity_id) as count_props,
strftime(created_at, '%Y-%m-%d') as date
FROM
consultancy_d.exceptions
WHERE
JSON_VALID(exception) = true AND
JSON_EXTRACT(exception, '$.status') > 400
GROUP BY
date
""").show()
def properties_distance(self):
return self.connection.sql("""
LOAD spatial;
CREATE OR REPLACE VIEW geolocation_changes AS
SELECT
exceptions.entity_id,
properties.check_data AS geolocation_original,
SUBSTRING(exceptions.exception, 28) AS geolocation_new,
ST_Distance_Sphere(
ST_GeomFromText(
CONCAT(
'POINT(',
REPLACE(properties.check_data, ',', ' '),
')'
)
),
ST_GeomFromText(
CONCAT(
'POINT(',
REPLACE(SUBSTRING(exceptions.exception, 28), ',', ' '),
')'
)
)
) AS distance
FROM
consultancy_d.exceptions
LEFT JOIN
consultancy_d.properties ON exceptions.entity_id = properties.id
WHERE
exception LIKE 'geoLocation was different%'
GROUP BY
entity_id,
check_data,
geolocation_new
ORDER BY
distance;
SELECT * FROM geolocation_changes;
SELECT
'0 bis 25' AS category,
COUNT(*) as count_properties
FROM
geolocation_changes
WHERE
distance >= (0)
AND distance < (25)
UNION
SELECT
'25 bis 50' AS category,
COUNT(*) as count_properties
FROM
geolocation_changes
WHERE
distance >= (25)
AND distance < (50)
UNION
SELECT
'50 bis 75' AS category,
COUNT(*) as count_properties
FROM
geolocation_changes
WHERE
distance >= (50)
AND distance < (75)
UNION
SELECT
'75 bis 100' AS category,
COUNT(*) as count_properties
FROM
geolocation_changes
WHERE
distance >= (75)
AND distance < (100);
""")
def properties_exceptions(self):
return self.connection.sql("""
SELECT
JSON_EXTRACT(exception, '$.status') AS exception_status,
COUNT(JSON_EXTRACT(exception, '$.status')) AS exception_count
FROM
consultancy_d.exceptions
WHERE
type != 'property'
GROUP BY
JSON_EXTRACT(exception, '$.status')
""")
def extractions(self):
return self.connection.sql(f"""
SELECT
JSON_EXTRACT(body, '$.content.days') as calendar,
property_id,
created_at
FROM
consultancy_d.extractions
WHERE
type == 'calendar'
ORDER BY
property_id
""")
def extractions_for(self, property_id):
return self.connection.sql(f"""
SELECT
JSON_EXTRACT(body, '$.content.days') as calendar,
property_id,
created_at
FROM
consultancy_d.extractions
WHERE
type == 'calendar' AND
property_id = {property_id}
ORDER BY
property_id
""")
# Number of extracted properties per extraction run
def properties_per_extraction(self):
return self.connection.sql("""
SELECT
COUNT(property_id),
strftime(created_at, '%Y-%m-%d') AS date
FROM
consultancy_d.extractions
WHERE
type == 'calendar'
GROUP BY
date
ORDER BY date ASC
""")
def price(self):
return self.connection.sql("""
SELECT
JSON_EXTRACT(body, '$.content.lowestPrice.valueWeekRaw') AS pricePerWeek,
JSON_EXTRACT(body, '$.content.lowestPrice.valueNightRaw') AS pricePerNight,
JSON_EXTRACT(body, '$.content.lowestPrice.currency') AS currency,
property_id,
created_at
FROM
consultancy_d.extractions
WHERE
type == 'price'
ORDER BY property_id
""")
def price_developement_per_property(self):
return self.connection.sql("""
SELECT
JSON_EXTRACT(body, '$.content.lowestPrice.valueNightRaw') AS pricePerNight,
property_id,
created_at
FROM
consultancy_d.extractions
WHERE
type == 'price'
ORDER BY property_id
""")
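The `properties_distance` query above relies on DuckDB's spatial extension for the great-circle distance. The same distance can be sketched in plain Python with the haversine formula (a minimal sketch; the 6 371 km mean Earth radius is an assumption, and `ST_Distance_Sphere` may use a slightly different radius):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (haversine)."""
    R = 6371000.0  # assumed mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))
```

Distances computed this way can then be bucketed exactly like the '0 bis 25' through '75 bis 100' categories in the query.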

View File

@ -1,47 +0,0 @@
import polars as pl
import json
from datetime import datetime, timedelta
def expansion_Pipeline(df):
'''
Rearranges a given extractions DataFrame into an expanded DataFrame.
New columns: property_id, created_at, calendar_date, calendar_value
:param df: input from database.py/extractions or database.py/extractions_for
:return: expanded DataFrame
'''
data = []
for row in df.iter_rows():
propId = row[1]
createdAt = row[2]
if row[0]:
temp = json.loads(row[0])
keys = temp.keys()
for key in keys:
out = [propId, createdAt.date(), datetime.strptime(key, '%Y-%m-%d').date(), temp[key]]
data.append(out)
df = pl.DataFrame(data, schema=["property_id", "created_at", "calendar_date", "calendar_value"])
return df
def liveDates_Pipeline(df):
'''
Returns the expanded Dataframe with only the live data and no future data
:param df: Inputs from database.py/extractions or database.py/extractions_for functions
:return: expanded and filtered dataframe
'''
df = expansion_Pipeline(df)
df = df.filter(pl.col("calendar_date") == pl.col("created_at") + timedelta(days=2))
return df
def liveDates_PipelineFromExpanded(df):
'''
Filters an already expanded df and returns only the live data and no future data
NOTE: the actual live date and the next day are always 0, most likely because booking on the
current or the next day is forbidden. Workaround: compare with the day after tomorrow.
:param df: Inputs from expansion_Pipeline
:return: expanded and filtered dataframe
'''
df = df.filter(pl.col("calendar_date") == pl.col("created_at")+timedelta(days=2))
return df
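The expansion step above can be illustrated without polars; a minimal stdlib-only sketch with invented sample data:

```python
import json
from datetime import datetime

# One extraction row as returned by the extractions query:
# (calendar JSON, property_id, created_at) -- sample values are invented.
row = ('{"2024-06-01": 1, "2024-06-02": 0}', 42, datetime(2024, 5, 30))

expanded = []
calendar = json.loads(row[0])
for day, value in calendar.items():
    expanded.append(
        (row[1], row[2].date(), datetime.strptime(day, "%Y-%m-%d").date(), value)
    )
# expanded now holds one (property_id, created_at, calendar_date, calendar_value)
# tuple per calendar day, matching expansion_Pipeline's output schema.
```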

File diff suppressed because one or more lines are too long

View File

@ -1,16 +0,0 @@
import polars as pl
import data
inst = data.load()
test = inst.extractions_for(1).pl()
out = test.with_columns(
pl.col("calendar").str.extract_all(r"([0-9]{4}-[0-9]{2}-[0-9]{2})|[0-2]").alias("extracted_cal"),
)
out = out.drop(['calendar', 'property_id'])
print(out.to_dict(as_series=True))

View File

@ -1,199 +0,0 @@
import MySQLdb
import json
from datetime import datetime, timedelta
import numpy as np
def getPropertyDataFromDB():
db = MySQLdb.connect(host="localhost",user="root",passwd="admin",db="consultancy")
cur = db.cursor()
cur.execute("SELECT id, seed_id, check_data "
"FROM properties ")
propData = cur.fetchall()
db.close()
return propData
def getDataFromDB(propId):
'''
Fetches scrape dates and calendar data from the MySQL database for the given propId
:return: scrapeDates and calendarData
'''
db = MySQLdb.connect(host="localhost",user="root",passwd="admin",db="consultancy")
cur = db.cursor()
cur.execute("SELECT JSON_EXTRACT(header, '$.Date') "
"FROM extractions "
f"WHERE type='calendar' AND property_id = {propId};")
scrapeDates = cur.fetchall()
cur.execute("SELECT JSON_EXTRACT(body, '$.content.days') "
"FROM extractions "
f"WHERE type='calendar' AND property_id = {propId};")
calendarData = cur.fetchall()
db.close()
return scrapeDates, calendarData
def getUniqueScrapeDates():
db = MySQLdb.connect(host="localhost",user="root",passwd="admin",db="consultancy")
cur = db.cursor()
cur.execute("SELECT JSON_EXTRACT(header, '$.Date') "
"FROM extractions "
f"WHERE type='calendar'")
uniqueScrapeDates = cur.fetchall()
db.close()
return uniqueScrapeDates
def getPropsPerScrape(scrapeDate):
date = datetime.strptime(scrapeDate, '%Y-%m-%d')
end_date = date + timedelta(days=1)
db = MySQLdb.connect(host="localhost",user="root",passwd="admin",db="consultancy")
cur = db.cursor()
cur.execute("SELECT property_id "
"FROM extractions "
f"WHERE type='calendar' AND created_at > '{scrapeDate}' AND created_at < '{str(end_date)}'")
uniqueScrapeDates = cur.fetchall()
db.close()
return uniqueScrapeDates
def getuniquePropIdFromDB():
'''
Fetches the distinct property_ids from the MySQL database
:return: propList
'''
db = MySQLdb.connect(host="localhost",user="root",passwd="admin",db="consultancy")
cur = db.cursor()
cur.execute("SELECT DISTINCT property_id "
"FROM extractions;")
propIds = cur.fetchall()
db.close()
propList = []
for propId in propIds:
propList.append(propId[0])
return propList
def reformatScrapeDates(scrapeDatesIn):
'''
Reformats the scrape-date column to a short date format
:param scrapeDatesIn: raw '$.Date' header values from the extractions query
:return: list of 'YYYY-MM-DD' date strings
'''
scrapeDates = []
for row in scrapeDatesIn:
date = datetime.strptime(json.loads(row[0])[0], '%a, %d %b %Y %H:%M:%S %Z').date()
date_str = date.strftime('%Y-%m-%d')
scrapeDates.append(date_str)
return scrapeDates
def checkForLostProprty(calendarData):
'''
Checks if there are "None" entries in the calendarData, meaning the property was no longer found
:param calendarData:
:return: Boolean indicating if there are "None" Entries in the calendarData
'''
for row in calendarData:
if None in row:
return True
return False
def getMinMaxDate(calendarData):
'''
Gets the min and max date from the calendar data
:param calendarData: all calendar data from the query
:return: list of every date from the minimal to the maximal date
'''
# determine the minimal and maximal date
fullDateList = []
for row in calendarData:
tempJson = json.loads(row[0]).keys()
for key in tempJson:
#print(key)
fullDateList.append(datetime.strptime(key, '%Y-%m-%d').date())
end_dt = max(fullDateList)
start_dt = min(fullDateList)
delta = timedelta(days=1)
HeaderDates = []
while start_dt <= end_dt:
HeaderDates.append(start_dt)
start_dt += delta
return HeaderDates
def creatDataMatrix(HeaderDates, calendarData):
'''
Creates the data matrix from the calendar data
:param HeaderDates: list of all possible dates in the dataset, used as the headers
:param calendarData: the main information from the SQL query
:return: data matrix covering all dates in the dataset
'''
data = []
for row in calendarData:
tempList = [-1] * len(HeaderDates)
tempJson = json.loads(row[0])
for key in tempJson:
date = datetime.strptime(key, '%Y-%m-%d').date()
content = tempJson[key]
index = [i for i, x in enumerate(HeaderDates) if x == date]
tempList[index[0]] = content
data.append(tempList)
return data
def getAccuracy(df, baseLine, compLine):
'''
Calculates the accuracy of a given dataframe with a given baseLine and compLine
:param df:
:param baseLine:
:param compLine:
:return: Accuracy: The percentage of dates that had the same information in both baseLine and compLine
'''
try:
df = df.iloc[[baseLine,compLine]]
except IndexError:
return -1
total = 0
noChange = 0
first = True
for series_name, series in df.items():
if first:
first = False
else:
total += 1
#print(series_name)
if series[baseLine] != -1:
if series[compLine] != -1:
if series[baseLine] == series[compLine]:
noChange += 1
accuracy = noChange / total
return accuracy
def getMeanAccuracy(accList):
'''
Gets the mean accuracy over the entire time delay of one property
:param accList: List of accuracy Values of a comparison
:return: Average of the accuracy values while ignoring the '-1' values
'''
out = []
for row in accList:
row = [x for x in row if x != -1]
out.append(np.average(row))
return out
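The `getAccuracy` comparison can be reduced to plain lists; a minimal sketch with invented values, where -1 marks a date without data:

```python
# Availability values of two scrape rows over the same five calendar dates.
base = [1, 0, -1, 2, 2]
comp = [1, 1, -1, 2, 0]

total = len(base)  # every compared date counts toward the total
no_change = sum(
    1 for b, c in zip(base, comp)
    if b != -1 and c != -1 and b == c
)
accuracy = no_change / total  # 2 of 5 dates agree with valid data -> 0.4
```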

View File

@ -1,83 +0,0 @@
from datetime import datetime, timedelta
import json
import MySQLdb #Version 2.2.4
import pandas as pd #Version 2.2.3
import plotly.express as px #Version 5.24.1
db = MySQLdb.connect(host="localhost",user="root",passwd="admin",db="consultancy")
cur = db.cursor()
cur.execute("SELECT JSON_EXTRACT(header, '$.Date') "
"FROM extractions "
"WHERE type='calendar' AND property_id = 200;")
dateoutput = cur.fetchall()
cur.execute("SELECT JSON_EXTRACT(body, '$.content.days') "
"FROM extractions "
"WHERE type='calendar' AND property_id = 200;")
output = cur.fetchall()
db.close()
# create scrape-date list
ytickVals = list(range(0, 30, 5))
scrapeDates = []
#print(dateoutput)
for row in dateoutput:
date = datetime.strptime(json.loads(row[0])[0], '%a, %d %b %Y %H:%M:%S %Z').date()
date_str = date.strftime('%d/%m/%Y')
scrapeDates.append(date_str)
# determine the minimal and maximal date
fullDateList = []
for row in output:
tempJson = json.loads(row[0]).keys()
for key in tempJson:
#print(key)
fullDateList.append(datetime.strptime(key, '%Y-%m-%d').date())
end_dt = max(fullDateList)
start_dt = min(fullDateList)
delta = timedelta(days=1)
HeaderDates = []
while start_dt <= end_dt:
HeaderDates.append(start_dt)
start_dt += delta
#Create data-Matrix
data = []
for row in output:
tempList = [-1] * len(HeaderDates)
tempJson = json.loads(row[0])
for key in tempJson:
date = datetime.strptime(key, '%Y-%m-%d').date()
content = tempJson[key]
index = [i for i, x in enumerate(HeaderDates) if x == date]
tempList[index[0]] = content
data.append(tempList)
#Transform to Dataframe for Plotly
df = pd.DataFrame(data, columns=HeaderDates)
# generate Plotly diagram
colScale = [[0, 'rgb(0, 0, 0)'], [0.33, 'rgb(204, 16, 16)'], [0.66, 'rgb(10, 102, 15)'], [1, 'rgb(17, 184, 26)']]
fig = px.imshow(df, color_continuous_scale= colScale)
lines = list(range(0,30,1))
for i in lines:
#fig.add_hline(y=i+0.5, line_color="white")
fig.add_hline(y=i+0.5)
fig.update_layout(yaxis = dict(tickfont = dict(size=50)))
fig.update_layout(xaxis = dict(tickfont = dict(size=50)))
fig.update_layout(xaxis_title="Verfügbarkeitsdaten Mietobjekt", yaxis_title="Scrapingvorgang")
fig.update_xaxes(title_font_size=100, title_font_weight="bold")
fig.update_yaxes(title_font_size=100, title_font_weight="bold")
fig.update_layout(yaxis = dict(tickmode = 'array',tickvals = ytickVals, ticktext = scrapeDates))
fig.update_xaxes(title_standoff = 80)
fig.update_yaxes(title_standoff = 80)
fig.update_layout(xaxis={'side': 'top'})
fig.show()

View File

@ -1,58 +0,0 @@
import Data_Analysis as DA
import pandas as pd
accuracy = pd.read_csv(f'results/accMeanDf.csv')
propData = DA.getPropertyDataFromDB()
propData = pd.DataFrame(propData, columns =['property_id', 'region', 'geoLocation'])
propData = propData.drop(columns=['geoLocation'])
#print(propData)
merge = pd.merge(propData, accuracy, on="property_id")
#print(merge)
# 1 = Heidiland, 2 = Davos, 3 = Engadin, 4 = St.Moritz
heidiAcc = merge[merge['region'] == 1]
davosAcc = merge[merge['region'] == 2]
EngadAcc = merge[merge['region'] == 3]
StMorAcc = merge[merge['region'] == 4]
heidiMean = heidiAcc.mean(axis=0)
davosMean = davosAcc.mean(axis=0)
EngadMean = EngadAcc.mean(axis=0)
StMorMean = StMorAcc.mean(axis=0)
heidiSDev = heidiAcc.std(axis=0)
davosSDev = davosAcc.std(axis=0)
EngadSDev = EngadAcc.std(axis=0)
StMorSDev = StMorAcc.std(axis=0)
accuracyOverview = pd.DataFrame()
accuracyOverview.insert(0, "St. Moritz StdDev", StMorSDev, True)
accuracyOverview.insert(0, "St. Moritz Mean", StMorMean, True)
accuracyOverview.insert(0, "Engadin StdDev", EngadSDev, True)
accuracyOverview.insert(0, "Engadin Mean", EngadMean, True)
accuracyOverview.insert(0, "Davos StdDev", davosSDev, True)
accuracyOverview.insert(0, "Davos Mean", davosMean, True)
accuracyOverview.insert(0, "Heidi StdDev", heidiSDev, True)
accuracyOverview.insert(0, "Heidi Mean", heidiMean, True)
accuracyOverview.drop(index=accuracyOverview.index[0], axis=0, inplace=True)
accuracyOverview.drop(index=accuracyOverview.index[0], axis=0, inplace=True)
accuracyOverview.to_csv('results/accuracyOverview.csv', index=True)
#delete unused DF's
del merge, accuracy, propData
del heidiAcc, davosAcc, EngadAcc, StMorAcc
del heidiMean, davosMean, EngadMean, StMorMean
del heidiSDev, davosSDev, EngadSDev, StMorSDev
print(accuracyOverview)

View File

@ -1,73 +0,0 @@
import pandas as pd
import os
import re
import numpy as np
def getAccuracy(df, baseLine, compLine):
try:
df = df.iloc[[baseLine,compLine]]
except IndexError:
return -1
total = 0
noChange = 0
first = True
for series_name, series in df.items():
if first:
first = False
else:
total += 1
#print(series_name)
if series[baseLine] != -1:
if series[compLine] != -1:
if series[baseLine] == series[compLine]:
noChange += 1
accuracy = noChange / total
return accuracy
def getMeanAccuracy(accList):
out = []
for row in accList:
row = [x for x in row if x != -1]
out.append(np.average(row))
return out
deltaList = [1, 2, 10, 20]
# 1 = 1 scrape interval
# 2 = approx. 1 week
# 10 = 1 month (30 days)
# 20 = 2 months
directory = os.fsencode("dok")
columnNames = ['property_id', 'timedelay_1', 'timedelay_2','timedelay_10','timedelay_20']
accListDf = pd.DataFrame(columns = columnNames)
accMeanDf = pd.DataFrame(columns = columnNames)
for file in os.listdir(directory):
filename = os.fsdecode(file)
if filename.endswith(".csv"):
propId = re.findall(r"\d+", filename)[0]
print(propId)
df = pd.read_csv(f'dok/{filename}')
fullList = []
accList = []
# loop through all deltas in the deltaList
for delta in deltaList:
accList = []
# loop through all dates as the baseline date
for i in range(df.shape[0]):
acc = getAccuracy(df, i, i+delta)
accList.append(acc)
fullList.append(accList)
meanList = getMeanAccuracy(fullList)
accListDf = accListDf._append({'property_id': propId, 'timedelay_1': fullList[0], 'timedelay_2': fullList[1], 'timedelay_10': fullList[2], 'timedelay_20': fullList[3]}, ignore_index=True)
accMeanDf = accMeanDf._append({'property_id': propId, 'timedelay_1': meanList[0], 'timedelay_2': meanList[1], 'timedelay_10': meanList[2], 'timedelay_20': meanList[3]}, ignore_index=True)
accListDf.to_csv('results/accListDf.csv', index=False)
accMeanDf.to_csv('results/accMeanDf.csv', index=False)

View File

@ -1,20 +0,0 @@
import Data_Analysis as DA
import csv
propIds = DA.getuniquePropIdFromDB()
lostProperties = []
for propId in propIds:
print(propId)
scrapeDates, calendarData = DA.getDataFromDB(propId)
if DA.checkForLostProprty(calendarData):
lostProperties.append(propId)
print(f"{len(lostProperties)} of {len(propIds)} properties are lost")
with open('results/allLostProperties', 'w') as f:
write = csv.writer(f)
write.writerow(lostProperties)
#Output: 221 of 1552 properties were lost at some point

View File

@ -1,28 +0,0 @@
import Data_Analysis as DA
import pandas as pd
import os
propIds = DA.getuniquePropIdFromDB()
for propId in propIds:
name = f"dok/calendarData_prop{propId}.csv"
if not os.path.exists(name):
print(propId)
scrapeDates, calendarData = DA.getDataFromDB(propId)
if DA.checkForLostProprty(calendarData):
print(f"Lost property: {propId}")
else:
scrapeDates = DA.reformatScrapeDates(scrapeDates)
HeaderDates = DA.getMinMaxDate(calendarData)
data = DA.creatDataMatrix(HeaderDates, calendarData)
# Transform to Dataframe for Plotly
df = pd.DataFrame(data, columns=HeaderDates)
df.insert(0, "ScrapeDate", scrapeDates, True)
df = df.drop(index=0)  # irregular gap between scraping runs (only 2 days)
df = df.drop(df.columns[[1, 2]], axis=1)
df.to_csv(name, index=False)

View File

@ -1,32 +0,0 @@
import Data_Analysis as DA
import pandas as pd
# read all scrape dates, reformat them and remove duplicates
uniqueScrapeDates = DA.getUniqueScrapeDates()
uniqueScrapeDates = DA.reformatScrapeDates(uniqueScrapeDates)
uniqueScrapeDates = list(dict.fromkeys(uniqueScrapeDates))
#print(uniqueScrapeDates)
# build the list of property lists per scrape date
fullPropList = []
for date in uniqueScrapeDates:
propList = []
strDate = date
properties = DA.getPropsPerScrape(strDate)
for prop in properties:
propList.append(prop[0])
propList = list(dict.fromkeys(propList))
fullPropList.append(propList)
#print(propList)
print(fullPropList)
# convert to a DataFrame with property IDs as column names and one-hot encoding
all_property_ids = sorted(set([item for sublist in fullPropList for item in sublist]))
print(all_property_ids)
df = pd.DataFrame(0, index=range(len(fullPropList)), columns=all_property_ids)
for i, property_list in enumerate(fullPropList):
df.loc[i, property_list] = 1
df.to_csv('results/PropertiesPerScrape.csv', index=True)
print(df)
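The one-hot encoding built above with pandas can be sketched in plain Python (invented property IDs):

```python
# Properties observed on each scrape run (one inner list per scrape date).
full_prop_list = [[101, 103], [101, 102]]

all_ids = sorted({p for scrape in full_prop_list for p in scrape})
one_hot = [[1 if p in scrape else 0 for p in all_ids] for scrape in full_prop_list]
# columns are [101, 102, 103]; each row flags which properties that scrape saw
```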

File diff suppressed because one or more lines are too long

Some files were not shown because too many files have changed in this diff.