project_name,oneliner,git_namespace,git_url,platform,topics,rubric,last_commit_date,stargazers_count,number_of_dependents,stars_last_year,project_active,dominating_language,organization,organization_user_name,languages,homepage,readme_content,refs,project_created,project_age_in_days,license,total_commits_last_year,total_number_of_commits,last_issue_closed,open_issues,closed_pullrequests,closed_issues,issues_closed_last_year,days_until_last_issue_closed,open_pullrequests,reviews_per_pr,development_distribution_score,last_released_date,last_release_tag_name,good_first_issue,contributors,accepts_donations,donation_platforms,code_of_conduct,contribution_guide,dependents_repos,organization_name,organization_github_url,organization_website,organization_location,organization_country,organization_form,organization_avatar,organization_public_repos,organization_created,organization_last_update
pvlib-python,A set of documented functions for simulating the performance of photovoltaic energy systems.,pvlib,https://github.com/pvlib/pvlib-python.git,github,"solar-energy,python,renewable-energy,renewables,photovoltaic",Photovoltaics and Solar Energy,"2023/10/23, 15:15:23",958,431,214,true,Python,pvlib,pvlib,"Python,TeX,Cython,Pan",https://pvlib-python.readthedocs.io,"b'
\n\n
\n\n\npvlib python is a community supported tool that provides a set of\nfunctions and classes for simulating the performance of photovoltaic\nenergy systems. pvlib python was originally ported from the PVLIB MATLAB\ntoolbox developed at Sandia National Laboratories and it implements many\nof the models and methods developed at the Labs. More information on\nSandia Labs PV performance modeling programs can be found at\nhttps://pvpmc.sandia.gov/. We collaborate with the PVLIB MATLAB project,\nbut operate independently of it.\n\n\nDocumentation\n=============\n\nFull documentation can be found at [readthedocs](http://pvlib-python.readthedocs.io/en/stable/),\nincluding an [FAQ](http://pvlib-python.readthedocs.io/en/stable/user_guide/faq.html) page.\n\nInstallation\n============\n\npvlib-python releases may be installed using the ``pip`` and ``conda`` tools.\nPlease see the [Installation page](https://pvlib-python.readthedocs.io/en/stable/user_guide/installation.html) of the documentation for complete instructions.\n\n\nContributing\n============\n\nWe need your help to make pvlib-python a great tool!\nPlease see the [Contributing page](http://pvlib-python.readthedocs.io/en/stable/contributing.html) for more on how you can contribute.\nThe long-term success of pvlib-python requires substantial community support.\n\n\nCiting\n======\n\nIf you use pvlib-python in a published work, please cite:\n\n William F. Holmgren, Clifford W. Hansen, and Mark A. Mikofski.\n ""pvlib python: a python package for modeling solar energy systems.""\n Journal of Open Source Software, 3(29), 884, (2018).\n https://doi.org/10.21105/joss.00884\n\nPlease also cite the DOI corresponding to the specific version of\npvlib-python that you used. 
pvlib-python DOIs are listed at\n[Zenodo.org](https://zenodo.org/search?page=1&size=20&q=conceptrecid:593284&all_versions&sort=-version)\n\nIf you use pvlib-python in a commercial or publicly-available application, please\nconsider displaying one of the ""powered by pvlib"" logos:\n\n![]()
\n\nGetting support\n===============\n\npvlib usage questions can be asked on\n[Stack Overflow](http://stackoverflow.com) and tagged with\nthe [pvlib](http://stackoverflow.com/questions/tagged/pvlib) tag.\n\nThe [pvlib-python google group](https://groups.google.com/forum/#!forum/pvlib-python)\nis used for discussing various topics of interest to the pvlib-python\ncommunity. We also make new version announcements on the google group.\n\nIf you suspect that you may have discovered a bug or if you\'d like to\nchange something about pvlib, then please make an issue on our\n[GitHub issues page](https://github.com/pvlib/pvlib-python/issues).\n\n\n\nLicense\n=======\n\nBSD 3-clause.\n\n\nNumFOCUS\n========\n\npvlib python is a [NumFOCUS Affiliated Project](https://numfocus.org/sponsored-projects/affiliated-projects)\n\n[](https://numfocus.org/sponsored-projects/affiliated-projects)\n'",",https://doi.org/10.5281/zenodo.593284,https://doi.org/10.21105/joss.00884\n\nPlease,https://zenodo.org/search?page=1&size=20&q=conceptrecid:593284&all_versions&sort=-version","2015/02/17, 00:21:33",3172,BSD-3-Clause,124,1555,"2023/10/21, 12:52:12",205,852,1578,263,4,42,3.4,0.542319749216301,"2023/09/21, 
20:02:01",v0.10.2,8,105,false,,true,true,"andreabotti/itacca,NREL/EVOLVE,atiqureee51/energy-financial-PV,Satyabhama-Reddy/shadow-analysis,nonzchanon/hello-streamlit,realnumber666/smart-city,sravya97/Shadow-Analysis,shreyasskasetty-tamu/shadow_analysis,Patrickyyh/shadow_analysis_yuhao,IsurangaPerera/shadow-analaysis,woodjmichael/PROPHET-Load_LSTM,Haitham-ghaida/MCC,OneMogin/model_hofmann,TimWalter/solar-power-estimator,blazdob/consmodel,riparise/solar-panel-sizing-tool,D-Agar/CanSat-Data-Client,cutmyenergybill/domestic-energy-bill-reduction-app,MainakRepositor/Activation-Infopedia,haydaroglu/SolarEnergyPredictionSuite,openclimatefix/nowcasting_forecast,makkorjamal/hsdigitizer,openclimatefix/nowcasting_dataset,fulvio9999/PROGETTO-BD,justinfmccarty/morph-app,fgbg03/PROCSIM-Running-Results-Frontend,piotre13/NODES_platform,RWTH-EBC/AixWeather,justinfmccarty/pyepwmorph,pail23/energy-assistant-backend,heetbeet/heliostat-prototype,DavidMbx/Tesi_DavideCasuccio_SimulationTool,maniraman1982/Portola-master,AEJaspan/irradianceforecasting,matiasctrs/app-apv,Deutsches-Brennstoffinstitut/H2-Index-III,manitcs1982/PortolaProd2024,shadingfish/Dingyi_Yu-Digital-Agri-Lab,owentmfoo/WSCForecast,interuss/monitoring,manitcs1982/portolauatlocal,maniraman1982/uat,maniraman1982/portola-UAT,hmendo/chrpa,Shkryob/kostya-optimizer,hmendo/chrpa_test,Geoandres321/Trackers,Victor19970418/Power-consumption-forecast,manitcs1982/HaveBlue-root,cs224/jupyter-docker-hello-world,zhaoshouhang/pandas_learn,amritPVre/Solar_energy_estimator_v01,NREL/OCHRE,flecksi/container_test,stephansmit/heliostrome,AI-nergy/aiecommon,langestefan/pvcast,Deutsches-Brennstoffinstitut/DBI-MAT,mbari-org/pypam-based-processing,chz056/BEAR,cai-yutian/pv-modelling-mirror,NREL/lore,gereon-t/oaemapi,Mousa-Zerai/HiSim,marcwatine/CC-LUT,BETALAB-team/EUReCA,owentmfoo/dusc_race_strategy_archive,jucrramirez/OrnamentaRainbow,shirubana/Approaches2BifacialPerformanceMonitoring,isc-konstanz/th-e-fcst,JustinZarb/natural_maps,cedricleroy/solpos,
Hoshang111/lcoe-model,Sim-on-Wheels/Sim-on-wheels-Renderer,wago-stiftung/workshop_data-analytics,Nikita-Belyakov/CLOUD_SNOW_SEGMENTATION,kaiATtum/battery-swapping-station-model,SMEISEN/AutoPV,the-aerospace-corporation/selenium,pdb-94/miguel,dcambie/LSC-PM_solar_miniplant,occamssafetyrazor/deps,SETO2243/forecasting,tug-cps/inframonitor-public,amosnjenga/pcsrt_python_cli,PVSC-Python-Tutorials/PVSC50,in-RET/inretensys-fastapi,Mizunomori/NEM_Pricing,Lodinn/PAM-timeseries,2022Yalin/MSc,CDT-AIMLAC/team_7564616d_models,NOWUM/smartdso,pombredanne/5000-deps,amritPVre/inter-row_beta,in-RET/in.RET-EnSys-open-plan-GUI,tribp/solar-forecast-api,salazarna/synthetic-irradiance-sequence,Abercardsea/Abercardsea-solution,Hykire/photovoltaic_simulation,HZBSolarOptics/pv_tandem,JMMonte/Reentry,LuisGarayF/shipcal_cl,slacgismo/pv-validation-hub,streetplantsolar/enactment-eyelash-strike-a4mm,heliostrome/streamlit_app,guilhermedcastr0/simulador_solar,sebastienlanglois/dash-gis,kabirnund/PROTOTYPE,Constructor-999/PlantSat-CanSat,kperrynrel/time-shift-validation-hub,DarkBrain-LP/Essivi-API,MIPU-MPH/i-nergy,bladekeys/sipi_project,xinhhh/capstone,openclimatefix/gfs-downloader,nicosquare/rl-energy-management,mikecoughlan/multi-station-dbdt-risk-assessment,JonathanRJ404/Code_Etalonnage,data-overflow/solar-output-predictor,kianwasabi/Weather_Information_API,areyc2023/cno,zakwatts/NWP_Island_data,ski907/historicHeatFlux,openclimatefix/pv-site-api,AssessingSolar/AssessingSolar,ijbd/merra-power-generation,dtcc-platform/dtcc-solar,openclimatefix/pv-site-production,antoine-zurcher/PreSimulatorPV,StefaE/PVForecast,pvlib/solarfactors,MahrokhGB/Synthetic_PV_Profiles,openclimatefix/uk-pv-national-xg,zsteenson/GUI-two-axis-tracker-controller,owentmfoo/S5,kandersolar/twoaxistracking,wholmgren/twoaxistracking,TAMUparametric/energiapy,bitenergie/notes,gitter-badger/thea-1,openclimatefix/ocf-ml-metrics,ekck/solar-power-gen-analysis,openclimatefix/pv-solar-farm-forecasting,nicolasdesilles/PIRD,mark-mcnulty/two-a
xis-tracker-controller-code,UU-ER/EHUB-Py_Training,sofuetakuma112/pvpmc2022,lgaspard/repf,NOWUM/dmas,kike144/Solar,alksarioglou/compression_algorithms_fpga_ultrasound,NREL/PVDegradationTools,json-0201/weather_and_solar_summary,saif-byte/masterpiece_limited,kkriti01/pvsimulator,Apoorva64/mauritius-solar-pannel-placement-analysis,DitisAlex/HAN-OOSE-Prediction-Model-BackEnd,ski907/hff,openclimatefix/pv-site-prediction,Tsiri/RHEIA,19conte/TR_datasophia,DTUWindEnergy/hydesign,MORE-EU/hackathon_athena,NCAR/mlsurfacelayer,NCAR/grafs,ttjaden/vdi4657.app,nhcho91/active-weighting-cmrac,kfiramar/baldar,cambridge-cares/TheWorldAvatar,tadatoshi/agrivoltaics_supply_side_management,FedericoTartarini/tool-risk-scale-football-nsw,Yasmin1209/Freshwater-PV,openclimatefix/PVItaly,ArcticSnow/TopoPyScale,flecksi/pv_design_app,SepehrMosavat/PROGNOES,AlexandreHugoMathieu/pvfault_detection,BenWinchester/HEATDesalination,energyris-com/ai4hotels-demo,ShayanNaderi/PrecoolTool,slacgismo/pv-apache-beam,icaromisquita/WBS-Final-Project-PV-energy-prediction-for-Germany,rseng/rsepedia-analysis,magneb/skycolor,dvdjng/fpvwebsite,stefvra/energy_app,usnistgov/NIST_SG_SolarSim,JPCM95/portfo,simonneidhart/pv_mapping_ch,kaustuvchatterjee/vskp,openclimatefix/ocf_datapipes,pollination/sample-apps,flopaw/pvforecast-docker,moritz-reuter/ESEM-EE,PVSC-Python-Tutorials/PVPMC_2022,cwhanse/ivcurves,brandonhanner/Solar-Simulation,nikohou/pv-data-utility-suite,AssessingSolar/solarstations,openclimatefix/power_perceiver,LBNL-ETA/AFC,olousgap/Combi_CSP,VMLC-PV/PVLC_Diode_Fit,OpenSTEF/openstef,joaoguilhermeS/Web-Development,cire-thk/BifacialSimu,npapnet/Combi_CSP,pranavsinghal30/PVForecast,tongpu/solar-power-simulation,BETALAB-team/eureca-building,slacgismo/gismo-cloud-deploy,jranalli/solartoolbox,jjcaine/top_dependencies_python,LE2P/pybsrnqc,oie-mines-paristech/IEA_PVPS_T16_QC_pynb,msoutojr/IFT-6759-Photovoltaic-forecast,nicosquare/ml-707-project,EvYogi/Rivesaltes,AndresPadillaUcros/PhotovoltaicForecast,mj-xmr/SolOptXM
R,e-marco/onetmy,marcoboucas/irradiance,modusV/skia,sandialabs/CFTrack,enermaps/enermaps,helvecioneto/goes-data-toolkit,sophiegribben/EM401,davidusb-geek/emhass-add-on,aimlac-7564616D/models,cycle13/climate,abhijeetmokate/folder9876,zahraghh/DESweatherAnalysis,jsl12/plants,mtress/mtress,SeitaBV/flexmeasures-openweathermap,blaevens/PV_mcmc,ewilczynski/enermaps,energyscope/EnergyScope_multi_criteria,viktor-platform/sample-solar-panel-configurator,bhavyadureja25/ds_tool_with_upload,Sourabh470/IoT_Dashboard,shortfellow/dashapp,sunt05/SuPy,Caedin/RaspberryPiCamera,RubenVanEldik/ect2-bonus-project,instrat-pl/pypsa-pl,ZaninMarco/Copy-of-repository,mape-maker/mape-maker,Adrianonsare/EnergyAnalytics,covetool/clima,AnnabelNkir/La_Galerie,vondeF/solar_prediction-master,AIMLAC-Convergence/Convergence,YashasviBhatt/transformer_streaming,Bimal-Kumar-002/Sentiment-classification,inrae/SISPPEO,burundiocibu/pvlib-prometheus-exporter,fernandochacon92/AAM_APP_MA_2021,SteveShin9330/PV_PRED_LDAPS,PVSC-Python-Tutorials/pyData-2021-Solar-PV-Modeling,andresgm/cno_solar,rehomewebapp/REhome,rohitsanam/streamlit-basic-app,FZJ-IEK3-VSA/HiSim,tjcoathu/bifacial_radiance,AnaDue99/microCPV_Iluminacion,zahraghh/multi_objective_optimization,martinrteran/Taller_Python3,hancse/ai_model_deployment,amritPVre/Hourly_GHI_GII_App,davidusb-geek/emhass,thiagohgmello1/pvsystem,AvG97/Avocado-Analyzer,yaotc/PVODataset,waggle-sensor/plugin-solar-irradiance,AdamRJensen/adamrjensen.github.io,IMMM-SFA/diyepw,slacgismo/solar-data-tools,UlisseProject/ULISSE,isi-ies-group/hiperion_app,bershawi/PV-simulator-challenge,SESMG/SESMG,JoseDiego101/Blog,h-quest/FDDA_app,sandialabs/pecos,nxtrung87/Voila,rheia-framework/RHEIA,PVSC-Python-Tutorials/PVSC48-Python-Tutorial,sandialabs/pvOps,isc-konstanz/pvsys,UARENForecasting/ESPRR,thesethtruth/LESO,tinoetzold/PV-Prognose,cccwam/ift6759-project1-public,arkayyy/surya-disha,EURAC-EEBgroup/CULTURAL-E-Data-Visualization-Library,zahraghh/Two_Stage_SP,gpschnaars/zac_pvlib,francomozo/deep
Cloud,Saq90/ssg,gripenergy/omf,sillygoose/sbhistory,BBISSlab/DistributedEnergyGen,Vladykart/GPReport,mikofski/PVRW2021,mikofski/OLD-PVSC48-Python-Tutorial,CenterForTheBuiltEnvironment/clima,nacho-fm/glare,Peque/pvlib-procedural-test,SolarPerformanceInsight/solarperformanceinsight,UARENForecasting/erebos-ams2020,benjaminpillot/greece,Weiming-Hu/RenewableSimulator,SolarGenomeProject/uncertainty-pv-generation-models,rayarka/EG3301R_Data_and_Programs,jeanromainroy/spatial-prediction-afg-landmines,Vesino/Exergy-YDX,brizett/reegis_hp,Rumbelstilzchen/Monitoring,Vesino/pvlib_tesis,louis-richard/irfu-python,abachleda/FM_e-services,ihomelab/effect-of-sampling-rate-on-PV-self-consumption,DuraMAT/pvpro,oemof-heat/solar_models,traiyn/abcdtft,gayashiva/air_model,selmaneislam/rdootl,cdeline/bifacialvf,cogent-computing/Heed-Microcontroller,TheUninvitedGuest/tmh-challenge,zoezhang926/SolarMartSim,prathamrg/solarapp,Adrian658/ClearSkyDetection,jcjveraa/sunscreen-controller,RaphiOriginal/blindAutomation,jungse12/Ford-Deploy,jungse12/Ford-Website,Gkrumbach07/solar_forecaster,pvedu/pvon,danielvanpaass/Demonstrator,cropsinsilico/hothouse,chrisorner/PowerPlan,uiandwe/django-best-practice-example,emanuelosva/solaru,simardeep1792/LOHS,keenchan/111,keenchan/pv,david-salac/Renewable-energy-prediction,bradbase/flyingkoala_pvlib,DanielFonteneleNogueira/painel,quintel/scenario-tools,kwhanalytics/data-env,UniOfLeicester/plotdarn,greco-project/pvcompare,HrushikeshBodas/HRES_optimization,mesmo-dev/mesmo,yaricp/py-solarhouse,kwhanalytics/insurance-requirements,pvcaptest/pvcaptest,iansloop/PV_Mismatch_Refactoring,pvlib/pvanalytics,ECE-492-W2020-Group-6/smart-blinds-rpi,greco-project/greco_technologies,UARENForecasting/erebos,maxmills1/Irradiance,vegraux/solar_path,panzer/capstone-mppt,david-salac/Fast-SZA-and-SAA-computation,kwhanalytics/postgis-heliostats-py3.5,nicolasholland/yuce,nano-sippe/bifacial_illumination,FZJ-IEK3-VSA/RESKit,oemof/oemof-thermal,tylunel/pvpumpingsystem,Pyosch/vpplib,renewable
s-ninja/gsee,greco-project/cpvtopvlib,FZJ-IEK3-VSA/tsib,NREL/Solar-Forecasting,Jeffrey-Simpson/Solar-Forecasting,mikofski/pvwatts_emulator,toddkarin/vocmax,cbaretzky/BuildingEnergySimulation,Ekistica/irradiance_synth,coroa/tmhpvsim,MichaelHopwood/PVPolyfit,kwhanalytics/marvin,rory87/caledonia-energy,SolarArbiter/workshop,harpreet153/pv-graph,toddkarin/pvtools,duplessisaa/Docker,SolarArbiter/solarforecastarbiter-core,mesmo-dev/cobmo,25sal/pvprediction,josephmckinsey/NRELHackathon2018,BLM-UoR/BLM,NREL/pv_tomcat,be-lb/solar-loader,higab85/off-grid,buds-lab/the-building-data-genome-project,SunPower/pvfactors,hackingmaterials/duramat_dashboard,kwhanalytics/postgis-marvin,hackingmaterials/ivtools,michbeg/IBM-DSX,teamvirtue/smart-sockets-dev,JohannesBertens/InSolarBase,JasonTarzan/PV_Forecast,eckara/CSSS,heliotrope-energy/bifacial,louisguitton/ifp-class-ml,BreakingBytes/pvfree,BreakingBytes/simkit,tplemmens/InSolar,mikofski/pvsc44-clearsky-aod,Solcast/howto-pandas,vkinakh/calc-solar-panel-effectivness,WalterGoedecke/virtualenv,BreakingBytes/UncertaintyWrapper,open-fred/lib_validation,SunPower/PVMismatch,reegis/reegis,adriandole/pv_timelapse,isc-konstanz/emonpv,oemof/feedinlib,NREL/bifacialvf,NREL/rdtools,NREL/bifacial_radiance,nshakhat/PVSimulator",,https://github.com/pvlib,,,,,https://avatars.githubusercontent.com/u/11037261?v=4,,,
pvfactors,Open source view-factor model for diffuse shading and bifacial PV modeling.,SunPower,https://github.com/SunPower/pvfactors.git,github,"solar-energy,renewable-energy,python,bifacial",Photovoltaics and Solar Energy,"2022/02/22, 21:53:32",75,13,10,true,Python,SunPower,SunPower,"Python,Makefile",http://sunpower.github.io/pvfactors/,"b'pvfactors: irradiance modeling made simple\n==========================================\n\n|Logo|\n\n|CircleCI| |License| |PyPI-Status| |PyPI-Versions|\n\npvfactors is a tool used by PV professionals to calculate the\nirradiance incident on surfaces of a photovoltaic array. It relies on the use of\n2D geometries and view factors integrated mathematically into systems of\nequations to account for reflections between all of the surfaces.\n\npvfactors was originally ported from the SunPower developed \'vf_model\' package, which was introduced at the IEEE PV Specialist Conference 44 2017 (see [#pvfactors_paper]_ and link_ to paper).\n\n------------------------------------------\n\n.. contents:: Table of contents\n :backlinks: top\n :local:\n\n\nDocumentation\n-------------\n\nThe documentation can be found `here `_.\nIt includes a lot of tutorials_ that describe the different ways of using pvfactors.\n\n\nQuick Start\n-----------\n\nGiven some timeseries inputs:\n\n\n.. 
code:: python\n\n # Import external libraries\n from datetime import datetime\n import pandas as pd\n\n # Create input data\n df_inputs = pd.DataFrame(\n {\'solar_zenith\': [20., 50.],\n \'solar_azimuth\': [110., 250.],\n \'surface_tilt\': [10., 20.],\n \'surface_azimuth\': [90., 270.],\n \'dni\': [1000., 900.],\n \'dhi\': [50., 100.],\n \'albedo\': [0.2, 0.2]},\n index=[datetime(2017, 8, 31, 11), datetime(2017, 8, 31, 15)])\n df_inputs\n\n\n+---------------------+--------------+---------------+--------------+-----------------+--------+-------+--------+\n| | solar_zenith | solar_azimuth | surface_tilt | surface_azimuth | dni | dhi | albedo |\n+=====================+==============+===============+==============+=================+========+=======+========+\n| 2017-08-31 11:00:00 | 20.0 | 110.0 | 10.0 | 90.0 | 1000.0 | 50.0 | 0.2 |\n+---------------------+--------------+---------------+--------------+-----------------+--------+-------+--------+\n| 2017-08-31 15:00:00 | 50.0 | 250.0 | 20.0 | 270.0 | 900.0 | 100.0 | 0.2 |\n+---------------------+--------------+---------------+--------------+-----------------+--------+-------+--------+\n\n\nAnd some PV array parameters\n\n\n.. code:: python\n\n pvarray_parameters = {\n \'n_pvrows\': 3, # number of pv rows\n \'pvrow_height\': 1, # height of pvrows (measured at center / torque tube)\n \'pvrow_width\': 1, # width of pvrows\n \'axis_azimuth\': 0., # azimuth angle of rotation axis\n \'gcr\': 0.4, # ground coverage ratio\n }\n\nThe user can quickly create a PV array with ``pvfactors``, and manipulate it with the engine\n\n\n.. code:: python\n\n from pvfactors.geometry import OrderedPVArray\n # Create PV array\n pvarray = OrderedPVArray.init_from_dict(pvarray_parameters)\n\n\n\n.. 
code:: python\n\n from pvfactors.engine import PVEngine\n # Create engine\n engine = PVEngine(pvarray)\n # Fit engine to data\n engine.fit(df_inputs.index, df_inputs.dni, df_inputs.dhi,\n df_inputs.solar_zenith, df_inputs.solar_azimuth,\n df_inputs.surface_tilt, df_inputs.surface_azimuth,\n df_inputs.albedo)\n\nThe user can then plot the PV array geometry at any given time of the simulation:\n\n\n.. code:: python\n\n # Plot pvarray shapely geometries\n import matplotlib.pyplot as plt\n f, ax = plt.subplots(figsize=(10, 5))\n pvarray.plot_at_idx(1, ax)\n plt.show()\n\n.. image:: https://raw.githubusercontent.com/SunPower/pvfactors/master/docs/sphinx/_static/pvarray.png\n\n\nIt is then very easy to run simulations using the defined engine:\n\n\n.. code:: python\n\n pvarray = engine.run_full_mode(fn_build_report=lambda pvarray: pvarray)\n\n\nAnd inspect the results thanks to the simple geometry API:\n\n\n.. code:: python\n\n print(""Incident irradiance on front surface of middle pv row: {} W/m2""\n .format(pvarray.ts_pvrows[1].front.get_param_weighted(\'qinc\')))\n print(""Reflected irradiance on back surface of left pv row: {} W/m2""\n .format(pvarray.ts_pvrows[0].back.get_param_weighted(\'reflection\')))\n print(""Isotropic irradiance on back surface of right pv row: {} W/m2""\n .format(pvarray.ts_pvrows[2].back.get_param_weighted(\'isotropic\')))\n\n\n.. parsed-literal::\n\n Incident irradiance on front surface of middle pv row: [1034.968 886.377] W/m2\n Reflected irradiance on back surface of left pv row: [112.139 86.404] W/m2\n Isotropic irradiance on back surface of right pv row: [0.116 1.849] W/m2\n\n\nThe users can also create a ""report"" while running the simulations that will rely on the simple API shown above, and which will look like whatever the users want.\n\n.. 
code:: python\n\n # Create a function that will build a report\n def fn_report(pvarray): return {\'total_incident_back\': pvarray.ts_pvrows[1].back.get_param_weighted(\'qinc\'),\n \'total_absorbed_back\': pvarray.ts_pvrows[1].back.get_param_weighted(\'qabs\')}\n\n # Run full mode simulation\n report = engine.run_full_mode(fn_build_report=fn_report)\n\n # Print results (report is defined by report function passed by user)\n df_report = pd.DataFrame(report, index=df_inputs.index)\n df_report\n\n\n+---------------------+---------------------+---------------------+\n| | total_incident_back | total_absorbed_back |\n+=====================+=====================+=====================+\n| 2017-08-31 11:00:00 | 106.627832 | 103.428997 |\n+---------------------+---------------------+---------------------+\n| 2017-08-31 15:00:00 | 79.668878 | 77.278812 |\n+---------------------+---------------------+---------------------+\n\n\n\nInstallation\n------------\n\npvfactors is currently compatible and tested with Python 3.6+, and is available in `PyPI `_. The easiest way to install pvfactors is to use pip_ as follows:\n\n.. code:: sh\n\n $ pip install pvfactors\n\nThe package wheel files are also available in the `release section`_ of the Github repository.\n\n\nRequirements\n------------\n\nRequirements are included in the ``requirements.txt`` file of the package. Here is a list of important dependencies:\n\n* `numpy `_\n* `pvlib-python `_\n* `shapely `_\n\n\nCiting pvfactors\n----------------\n\nWe appreciate your use of pvfactors. If you use pvfactors in a published work, we kindly ask that you cite:\n\n\n.. parsed-literal::\n\n Anoma, M., Jacob, D., Bourne, B.C., Scholl, J.A., Riley, D.M. and Hansen, C.W., 2017. View Factor Model and Validation for Bifacial PV and Diffuse Shade on Single-Axis Trackers. 
In 44th IEEE Photovoltaic Specialist Conference.\n\n\nContributing\n------------\n\nContributions are needed in order to improve pvfactors.\nIf you wish to contribute, you can start by forking and cloning the repository, and then installing pvfactors using pip_ in the root folder of the package:\n\n.. code:: sh\n\n $ pip install .\n\n\nTo install the package in editable mode, you can use:\n\n.. code:: sh\n\n $ pip install -e .\n\nReleasing\n+++++++++\n\nWhen releasing pvfactors, you will need to run a couple of build commands. First make sure to activate your virtual environment if any, then:\n\n- create a tag on the latest master branch commit using `git tag -a vX.X.X`, and write a tag message. You can then push that tag to Github so that it will appear there.\n- build the documentation by running `make build-docs`. When done running, you should be able to open `build/sphinx/html/index.html`, and confirm that the version displayed is the same as the one from the git tag. You can deploy by copying the content of the `build/sphinx/html/` folder into the `gh-pages` branch of the repo (make sure to keep the `.nojekyll` file that\'s already present).\n- build the release files by running `make build-package`. When done running, you should be able to open `dist/` and see both a whl file and a tar file. Make sure that their names include the correct git tag you created. Please confirm that the whl file was built correctly by installing it locally and testing the newly released updates. You can deploy by 1) making a Github release from the tag you created and pushed, and including the files in `dist/` in the release. 2) The last step is to publish a release in PyPI, for which you can use twine and the command `twine upload dist/*`\n\n\n\n\nReferences\n----------\n\n.. [#pvfactors_paper] Anoma, M., Jacob, D., Bourne, B. C., Scholl, J. A., Riley, D. M., & Hansen, C. W. (2017). View Factor Model and Validation for Bifacial PV and Diffuse Shade on Single-Axis Trackers. 
In 44th IEEE Photovoltaic Specialist Conference.\n\n\n.. _link: https://pdfs.semanticscholar.org/ebb2/35e3c3796b158e1a3c45b40954e60d876ea9.pdf\n\n.. _tutorials: https://sunpower.github.io/pvfactors/tutorials/index.html\n\n.. _`full mode`: https://sunpower.github.io/pvfactors/theory/problem_formulation.html#full-simulations\n\n.. _`fast mode`: https://sunpower.github.io/pvfactors/theory/problem_formulation.html#fast-simulations\n\n.. _pip: https://pip.pypa.io/en/stable/\n\n.. _`release section`: https://github.com/SunPower/pvfactors/releases\n\n.. |Logo| image:: https://raw.githubusercontent.com/SunPower/pvfactors/master/docs/sphinx/_static/logo.png\n :target: http://sunpower.github.io/pvfactors/\n\n.. |CircleCI| image:: https://circleci.com/gh/SunPower/pvfactors.svg?style=shield\n :target: https://circleci.com/gh/SunPower/pvfactors\n\n.. |License| image:: https://img.shields.io/badge/License-BSD%203--Clause-blue.svg\n :target: https://github.com/SunPower/pvfactors/blob/master/LICENSE\n\n.. |PyPI-Status| image:: https://img.shields.io/pypi/v/pvfactors.svg\n :target: https://pypi.org/project/pvfactors\n\n.. |PyPI-Versions| image:: https://img.shields.io/pypi/pyversions/pvfactors.svg?logo=python&logoColor=white\n :target: https://pypi.org/project/pvfactors\n'",,"2018/05/14, 06:10:55",1990,BSD-3-Clause,0,104,"2023/04/28, 14:34:18",19,97,128,1,180,2,2.4,0.12903225806451613,"2022/02/22, 20:16:58",v1.5.2,0,5,false,,false,true,"matiasctrs/app-apv,Hoshang111/lcoe-model,areyc2023/cno,kike144/Solar,tadatoshi/agrivoltaics_supply_side_management,rseng/rsepedia-analysis,cire-thk/BifacialSimu,andresgm/cno_solar,thesethtruth/LESO,prathamrg/solarapp,anomam/pvfactors_iea_pvps_study,toddkarin/vocmax,toddkarin/pvtools",,https://github.com/SunPower,,,,,https://avatars.githubusercontent.com/u/1341977?v=4,,,
gsee,Global Solar Energy Estimator.,renewables-ninja,https://github.com/renewables-ninja/gsee.git,github,"solar,pandas,energy,irradiance,photovoltaic,pv,electricity,ninja",Photovoltaics and Solar Energy,"2020/07/21, 06:28:35",105,0,15,false,Python,Renewables.ninja,renewables-ninja,"Python,Makefile",https://gsee.readthedocs.io/,"b""[](https://dev.azure.com/renewables-ninja/gsee/_build?definitionId=1)\n[](https://codecov.io/gh/renewables-ninja/gsee)\n[](https://pypi.python.org/pypi/gsee)\n[](https://anaconda.org/conda-forge/gsee)\n\n# GSEE: Global Solar Energy Estimator\n\n`GSEE` is a solar energy simulation library designed for rapid calculations and ease of use. [Renewables.ninja](https://www.renewables.ninja/) uses `GSEE`.\n\nThe development of `GSEE` predates the existence of [`pvlib-python`](https://pvlib-python.readthedocs.io/) but builds on its functionality as of v0.4.0. Use `GSEE` if you want fast simulations with sensible defaults and solar energy technologies other than PV, and `pvlib-python` if you need control over the nuts and bolts of simulating PV systems.\n\n## Installation\n\n`GSEE` requires Python 3. The recommended way to install is through the [Anaconda Python distribution](https://www.continuum.io/downloads) and `conda-forge`:\n\n conda install -c conda-forge gsee\n\nYou can also install with `pip install gsee`, but if you do so, and do not already have `numpy` installed, you will get a compiler error when pip tries to build the `climatedata_interface` Cython extension.\n\n## Documentation\n\nSee the [documentation](https://gsee.readthedocs.io/) for more information on `GSEE`'s functionality and for examples.\n\n## Credits and contact\n\nContact [Stefan Pfenninger](mailto:stefan.pfenninger@usys.ethz.ch) for questions about `GSEE`. `GSEE` is also a component of the [Renewables.ninja](https://www.renewables.ninja) project, developed by Stefan Pfenninger and Iain Staffell. 
Use the [contact page](https://www.renewables.ninja/about) there if you want more information about Renewables.ninja.\n\n## Citation\n\nIf you use `GSEE` or code derived from it in academic work, please cite:\n\nStefan Pfenninger and Iain Staffell (2016). Long-term patterns of European PV output using 30 years of validated hourly reanalysis and satellite data. *Energy* 114, pp. 1251-1265. [doi: 10.1016/j.energy.2016.08.060](https://doi.org/10.1016/j.energy.2016.08.060)\n\n## License\n\nBSD-3-Clause\n""",",https://doi.org/10.1016/j.energy.2016.08.060","2016/09/01, 11:41:04",2610,BSD-3-Clause,0,56,"2022/03/10, 18:01:09",7,9,9,0,594,2,0.3333333333333333,0.07272727272727275,,,0,3,false,,false,true,,,https://github.com/renewables-ninja,https://www.renewables.ninja/,,,,https://avatars.githubusercontent.com/u/11838260?v=4,,,
PVMismatch,An explicit Python PV system IV & PV curve trace calculator which can also calculate mismatch.,SunPower,https://github.com/SunPower/PVMismatch.git,github,"numpy,scipy,python,solar,photovoltaic",Photovoltaics and Solar Energy,"2022/04/14, 19:15:36",63,0,13,false,Jupyter Notebook,SunPower,SunPower,"Jupyter Notebook,Python,Makefile,Batchfile",http://sunpower.github.io/PVMismatch/,"b'PVMismatch\n==========\n\nAn explicit IV & PV curve trace calculator for PV system circuits\n\nModel chain \n Cell > Cell string > Module > String > System\n \nKey Model inputs \n Cell technology characteristics \n \n Effective Irradiance (suns) \n \n Temperature (cell temperature)\n \n Bypass device configuration\n \n Cell string layout\n\n\n\n|Build Status|\n\nInstallation\n------------\n\nPVMismatch is on `PyPI `__. Install it\nwith `pip `__:\n\n::\n\n $ pip install pvmismatch\n\nRequirements\n------------\n\nPVMismatch requires NumPy, SciPy and matplotlib. These packages are available\nfrom PyPI, `Christoph Gohlke `__\nand Anaconda. You must install them prior to using PVMismatch.\n\nUsage\n-----\n\nPlease see the `documentation `__ for\ntutorials and the API. Bugs and feature requests can be reported on\n`GitHub `__. The change\nhistory is also on `GitHub `__.\n\n.. |Build Status| image:: https://travis-ci.org/SunPower/PVMismatch.svg?branch=master\n :target: https://travis-ci.org/SunPower/PVMismatch\n\n\nOther Projects that use PVMismatch\n----------------------------------\nSystem level mismatch loss calculator using PVMismatch tool (STC and Annual energy loss)\nhttps://github.com/SunPower/MismatchLossStudy \n\nCiting PVMismatch\n----------------------------------\nWe appreciate your use of PVMismatch, and ask that you appropriately cite the software in exchange for its open-source publication. \n\nMark Mikofski, Bennet Meyers, Chetan Chaudhari (2018). \xe2\x80\x9cPVMismatch Project: https://github.com/SunPower/PVMismatch"". 
SunPower Corporation, Richmond, CA.\n\nPlease consider adding a ""pvmismatch"" tag to StackOverflow/Quora/LinkedIn/ResearchGate posts related to PVMismatch. \n\n\nCurrent Maintainer at SunPower\n----------------------------------\n@ahoffmanSPWR\n'",,"2013/01/23, 00:26:43",3927,BSD-3-Clause,0,415,"2022/04/14, 19:15:36",48,41,95,0,559,5,2.2,0.17341040462427748,"2019/05/11, 20:24:38",v4.1,0,8,false,,false,true,,,https://github.com/SunPower,,,,,https://avatars.githubusercontent.com/u/1341977?v=4,,,
rdtools,An open source library to support reproducible technical analysis of time series data from photovoltaic energy systems.,NREL,https://github.com/NREL/rdtools.git,github,,Photovoltaics and Solar Energy,"2023/07/31, 14:42:08",133,7,21,true,Python,National Renewable Energy Laboratory,NREL,Python,https://rdtools.readthedocs.io/,"b'
\n\nMaster branch: \n[](https://github.com/NREL/rdtools/actions?query=branch%3Amaster) \n\nDevelopment branch: \n[](https://github.com/NREL/rdtools/actions?query=branch%3Adevelopment)\n\nRdTools is an open-source library to support reproducible technical analysis of\ntime series data from photovoltaic energy systems. The library aims to provide\nbest practice analysis routines along with the building blocks for users to\ntailor their own analyses.\nCurrent applications include the evaluation of PV production over several years to obtain\nrates of performance degradation and soiling loss. RdTools can handle\nboth high frequency (hourly or better) and low frequency (daily, weekly,\netc.) datasets. Best results are obtained with higher frequency data.\n\nRdTools can be installed automatically into Python from PyPI using the\ncommand line:\n\n```\npip install rdtools\n```\n\nFor API documentation and full examples, please see the [documentation](https://rdtools.readthedocs.io).\n\nRdTools is currently tested on Python 3.7+.\n\n## Citing RdTools\n\nTo cite RdTools, please use the following along with the version number\nand the specific DOI corresponding to that version from [Zenodo](https://doi.org/10.5281/zenodo.1210316):\n\n- Michael G. Deceglie, Ambarish Nag, Adam Shinn, Gregory Kimball,\n Daniel Ruth, Dirk Jordan, Jiyang Yan, Kevin Anderson, Kirsten Perry,\n Mark Mikofski, Matthew Muller, Will Vining, and Chris Deline\n RdTools, version {insert version}, Computer Software,\n https://github.com/NREL/rdtools. DOI:{insert DOI}\n\nThe underlying workflow of RdTools has been published in several places.\nIf you use RdTools in a published work, you may also wish to cite the following as\nappropriate:\n\n- Dirk Jordan, Chris Deline, Sarah Kurtz, Gregory Kimball, Michael Anderson, ""Robust PV\n Degradation Methodology and Application"", IEEE Journal of\n Photovoltaics, 8(2), pp. 
525-531, 2018, DOI: [10.1109/JPHOTOV.2017.2779779](https://doi.org/10.1109/JPHOTOV.2017.2779779)\n\n- Michael G. Deceglie, Leonardo Micheli and Matthew Muller, ""Quantifying Soiling Loss\n Directly From PV Yield,"" in IEEE Journal of Photovoltaics, 8(2),\n pp. 547-551, 2018, DOI: [10.1109/JPHOTOV.2017.2784682](https://doi.org/10.1109/JPHOTOV.2017.2784682)\n\n- Kevin Anderson and Ryan Blumenthal, ""Overcoming Communications Outages in\n Inverter Downtime Analysis"", 2020 IEEE 47th Photovoltaic Specialists\n Conference (PVSC)"" DOI: [10.1109/PVSC45281.2020.9300635](https://doi.org/10.1109/PVSC45281.2020.9300635)\n\n- Kirsten Perry, Matthew Muller and Kevin Anderson, ""Performance Comparison of Clipping\n Detection Techniques in AC Power Time Series,"" 2021 IEEE 48th Photovoltaic\n Specialists Conference (PVSC), pp. 1638-1643 2021, DOI: [10.1109/PVSC43889.2021.9518733](https://doi.org/10.1109/PVSC43889.2021.9518733).\n\n \n## References\nThe clear sky temperature calculation, `clearsky_temperature.get_clearsky_tamb()`, uses data\nfrom images created by Jesse Allen, NASA\xe2\x80\x99s Earth Observatory using data courtesy of the MODIS Land Group. \nhttps://neo.sci.gsfc.nasa.gov/view.php?datasetId=MOD_LSTD_CLIM_M \nhttps://neo.sci.gsfc.nasa.gov/view.php?datasetId=MOD_LSTN_CLIM_M\n\nOther useful references which may also be consulted for degradation rate methodology include:\n\n - D. C. Jordan, M. G. Deceglie, S. R. Kurtz, ""PV degradation methodology comparison \xe2\x80\x94 A basis for a standard"", in 43rd IEEE Photovoltaic Specialists Conference, Portland, OR, USA, 2016, DOI: 10.1109/PVSC.2016.7749593.\n - Jordan DC, Kurtz SR, VanSant KT, Newmiller J, Compendium of Photovoltaic Degradation Rates, Progress in Photovoltaics: Research and Application, 2016, 24(7), 978 - 989.\n - D. Jordan, S. Kurtz, PV Degradation Rates \xe2\x80\x93 an Analytical Review, Progress in Photovoltaics: Research and Application, 2013, 21(1), 12 - 29.\n - E. Hasselbrink, M. Anderson, Z. 
Defreitas, M. Mikofski, Y.-C.Shen, S. Caldwell, A. Terao, D. Kavulak, Z. Campeau, D. DeGraaff, ""Validation of the PVLife model using 3 million module-years of live site data"", 39th IEEE Photovoltaic Specialists Conference, Tampa, FL, USA, 2013, p. 7 \xe2\x80\x93 13, DOI: 10.1109/PVSC.2013.6744087.\n\n## Further Instructions and Updates\n\nCheck out the [wiki](https://github.com/NREL/rdtools/wiki) for additional usage documentation, and for information on development goals and framework.\n\n'",",https://doi.org/10.5281/zenodo.1210316,https://doi.org/10.1109/JPHOTOV.2017.2779779,https://doi.org/10.1109/JPHOTOV.2017.2784682,https://doi.org/10.1109/PVSC45281.2020.9300635,https://doi.org/10.1109/PVSC43889.2021.9518733","2016/11/18, 22:17:01",2531,MIT,67,689,"2023/10/18, 01:55:24",62,228,335,29,7,14,0.8,0.5055555555555555,"2023/07/31, 14:46:51",2.1.6,0,12,false,,true,true,"cai-yutian/pv-modelling-mirror,MORE-EU/hackathon_athena,bhavyadureja25/ds_tool_with_upload,h-quest/FDDA_app,mikofski/PVRW2021,josephmckinsey/NRELHackathon2018,hackingmaterials/duramat_dashboard",,https://github.com/NREL,http://www.nrel.gov,"Golden, CO",,,https://avatars.githubusercontent.com/u/1906800?v=4,,,
Machine-Learning-for-Solar-Energy-Prediction,Predict the power production of a solar panel farm from weather measurements using machine learning.,ColasGael,https://github.com/ColasGael/Machine-Learning-for-Solar-Energy-Prediction.git,github,"machine-learning,neural-network,data-processing,python,matlab,tensorflow",Photovoltaics and Solar Energy,"2019/11/07, 18:37:29",183,0,36,false,Python,,,"Python,MATLAB,R,TeX,TSQL",,"b'# Machine-Learning-for-Solar-Energy-Prediction\nby Adele Kuzmiakova, Gael Colas and Alex McKeehan, graduate students from Stanford University\n\nThis is our final project for the CS229 ""Machine Learning"" class at Stanford (2017). Our teachers were Prof. Andrew Ng and Prof. Dan Boneh.\n\nLanguage: Python, Matlab, R\n\nGoal: predict the hourly power production of a photovoltaic power station from the measurements of a set of weather features.\n\nThis project can be decomposed into 3 parts:\n - Data Pre-processing: we processed the raw weather data files (input) from the National Oceanic and Atmospheric Administration and the power production data files (output) from the Urbana-Champaign solar farm to get meaningful numeric values on an hourly basis;\n - Feature Selection: we ran correlation analysis between the weather features and the energy output to discard useless features; we also implemented Principal Component Analysis to reduce the dimension of our dataset;\n - Machine Learning: we compared the performance of our ML algorithms. Implemented models include Weighted Linear Regression with and without dimension reduction, Boosting Regression Trees, and artificial Neural Networks with and without vanishing temporal gradient.\n\nOur final report and poster are available at the root.\n'",,"2018/05/06, 19:43:04",1997,MIT,0,11,"2019/11/15, 19:01:02",1,0,3,0,1440,0,0,0.0,,,0,1,false,,false,true,,,,,,,,,,,
elpv-dataset,A dataset of functional and defective solar cells extracted from EL images of solar modules.,zae-bayern,https://github.com/zae-bayern/elpv-dataset.git,github,"photovoltaic,solar-energy,solar-cells,machine-learning,computer-vision",Photovoltaics and Solar Energy,"2021/05/24, 21:44:52",166,0,42,true,Python,ZAE Bayern,zae-bayern,Python,,"b'# A Benchmark for Visual Identification of Defective Solar Cells in Electroluminescence Imagery\n\nThis repository provides a dataset of solar cell images extracted from\nhigh-resolution electroluminescence images of photovoltaic modules.\n\n\n\n## The Dataset\n\nThe dataset contains 2,624 samples of 300x300 pixel 8-bit grayscale images of\nfunctional and defective solar cells with varying degrees of degradation\nextracted from 44 different solar modules. The defects in the annotated images\nare either of intrinsic or extrinsic type and are known to reduce the power\nefficiency of solar modules.\n\nAll images are normalized with respect to size and perspective.\nAdditionally, any distortion induced by the camera lens used to capture the EL images was\neliminated prior to solar cell extraction.\n\n## Annotations\n\nEvery image is annotated with a defect probability (a floating point value\nbetween 0 and 1) and the type of the solar module (either mono- or\npolycrystalline) the solar cell image was originally extracted from.\n\nThe individual images are stored in the `images` directory and the corresponding\nannotations in `labels.csv`.\n\n## Usage\n\nIn Python, use `utils/elpv_reader` in this repository to load the images and the\ncorresponding annotations as follows:\n\n```python\nfrom elpv_reader import load_dataset\nimages, proba, types = load_dataset()\n```\n\nThe code requires NumPy and Pillow to work correctly.\n\n## Citing\n\nIf you use this dataset in a scientific context, please cite the following\npublications:\n\n> Buerhop-Lutz, C.; Deitsch, S.; Maier, A.; Gallwitz, F.; Berger, S.; Doll, B.; Hauch, J.; Camus, 
C. & Brabec, C. J. A Benchmark for Visual Identification of Defective Solar Cells in Electroluminescence Imagery. European PV Solar Energy Conference and Exhibition (EU PVSEC), 2018. DOI: [10.4229/35thEUPVSEC20182018-5CV.3.15](http://dx.doi.org/10.4229/35thEUPVSEC20182018-5CV.3.15)\n\n> Deitsch, S., Buerhop-Lutz, C., Sovetkin, E., Steland, A., Maier, A., Gallwitz, F., & Riess, C. (2021). Segmentation of photovoltaic module cells in uncalibrated electroluminescence images. Machine Vision and Applications, 32(4). DOI: [10.1007/s00138-021-01191-9](https://doi.org/10.1007/s00138-021-01191-9)\n\n> Deitsch, S.; Christlein, V.; Berger, S.; Buerhop-Lutz, C.; Maier, A.; Gallwitz, F. & Riess, C. Automatic classification of defective photovoltaic module cells in electroluminescence images. Solar Energy, Elsevier BV, 2019, 185, 455-468. DOI: [10.1016/j.solener.2019.02.067](http://dx.doi.org/10.1016/j.solener.2019.02.067)\n\nBibTeX details:\n\n\n\n```bibtex\n\n@InProceedings{Buerhop2018,\n author = {Buerhop-Lutz, Claudia and Deitsch, Sergiu and Maier, Andreas and Gallwitz, Florian and Berger, Stephan and Doll, Bernd and Hauch, Jens and Camus, Christian and Brabec, Christoph J.},\n title = {A Benchmark for Visual Identification of Defective Solar Cells in Electroluminescence Imagery},\n booktitle = {European PV Solar Energy Conference and Exhibition (EU PVSEC)},\n year = {2018},\n eventdate = {2018-09-24/2018-09-28},\n venue = {Brussels, Belgium},\n doi = {10.4229/35thEUPVSEC20182018-5CV.3.15},\n}\n\n@Article{Deitsch2021,\n author = {Deitsch, Sergiu and Buerhop-Lutz, Claudia and Sovetkin, Evgenii and Steland, Ansgar and Maier, Andreas and Gallwitz, Florian and Riess, Christian},\n date = {2021},\n journaltitle = {Machine Vision and Applications},\n title = {Segmentation of photovoltaic module cells in uncalibrated electroluminescence images},\n doi = {10.1007/s00138-021-01191-9},\n issn = {1432-1769},\n number = {4},\n volume = {32},\n}\n\n@Article{Deitsch2019,\n author = 
{Sergiu Deitsch and Vincent Christlein and Stephan Berger and Claudia Buerhop-Lutz and Andreas Maier and Florian Gallwitz and Christian Riess},\n title = {Automatic classification of defective photovoltaic module cells in electroluminescence images},\n journal = {Solar Energy},\n year = {2019},\n volume = {185},\n pages = {455--468},\n month = jun,\n issn = {0038-092X},\n doi = {10.1016/j.solener.2019.02.067},\n publisher = {Elsevier {BV}},\n}\n```\n \n\n## License\n\n![]()
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.\n\nFor commercial use, please contact us for further information.\n'",",https://doi.org/10.1007/s00138-021-01191-9","2018/03/07, 10:53:56",2058,CUSTOM,0,18,"2022/12/01, 21:22:32",0,0,4,1,327,0,0,0.0,"2021/05/25, 12:15:29",v1.0,0,1,false,,false,true,,,https://github.com/zae-bayern,https://www.zae-bayern.de,,,,https://avatars.githubusercontent.com/u/37119743?v=4,,,
feedinlib,Contains implementations of photovoltaic models to calculate electricity generation from a PV installation based on given solar radiation. Furthermore it contains all necessary pre-calculations.,oemof,https://github.com/oemof/feedinlib.git,github,,Photovoltaics and Solar Energy,"2023/07/30, 04:54:03",79,13,10,true,Python,oemof community,oemof,Python,,"b'========\nOverview\n========\n\n.. start-badges\n\n.. list-table::\n :stub-columns: 1\n\n|workflow_pytests| |workflow_checks| |docs| |appveyor| |requires| |coveralls| |packaging|\n|version| |wheel| |supported-versions| |supported-implementations| |commits-since|\n\n.. |docs| image:: https://readthedocs.org/projects/feedinlib/badge/?style=flat\n :target: https://feedinlib.readthedocs.io/\n :alt: Documentation Status\n\n.. |workflow_pytests| image:: https://github.com/oemof/feedinlib/workflows/tox%20pytests/badge.svg?branch=revision/add-tox-github-workflows-src-directory-ci\n :target: https://github.com/oemof/feedinlib/actions?query=workflow%3A%22tox+pytests%22\n\n.. |workflow_checks| image:: https://github.com/oemof/feedinlib/workflows/tox%20checks/badge.svg?branch=revision/add-tox-github-workflows-src-directory-ci\n :target: https://github.com/oemof/feedinlib/actions?query=workflow%3A%22tox+checks%22\n\n.. |packaging| image:: https://github.com/oemof/feedinlib/workflows/packaging/badge.svg?branch=revision/add-tox-github-workflows-src-directory-ci\n :target: https://github.com/oemof/feedinlib/actions?query=workflow%3Apackaging\n\n.. |appveyor| image:: https://ci.appveyor.com/api/projects/status/github/oemof/feedinlib?branch=master&svg=true\n :alt: AppVeyor Build Status\n :target: https://ci.appveyor.com/project/oemof/feedinlib\n\n.. |requires| image:: https://requires.io/github/oemof/feedinlib/requirements.svg?branch=master\n :alt: Requirements Status\n :target: https://requires.io/github/oemof/feedinlib/requirements/?branch=master\n\n.. 
|coveralls| image:: https://coveralls.io/repos/oemof/feedinlib/badge.svg?branch=master&service=github\n :alt: Coverage Status\n :target: https://coveralls.io/r/oemof/feedinlib\n\n.. |version| image:: https://img.shields.io/pypi/v/feedinlib.svg\n :alt: PyPI Package latest release\n :target: https://pypi.org/project/feedinlib\n\n.. |wheel| image:: https://img.shields.io/pypi/wheel/feedinlib.svg\n :alt: PyPI Wheel\n :target: https://pypi.org/project/feedinlib\n\n.. |supported-versions| image:: https://img.shields.io/pypi/pyversions/feedinlib.svg\n :alt: Supported versions\n :target: https://pypi.org/project/feedinlib\n\n.. |supported-implementations| image:: https://img.shields.io/pypi/implementation/feedinlib.svg\n :alt: Supported implementations\n :target: https://pypi.org/project/feedinlib\n\n.. |commits-since| image:: https://img.shields.io/github/commits-since/oemof/feedinlib/v0.0.12.svg\n :alt: Commits since latest release\n :target: https://github.com/oemof/feedinlib/compare/v0.0.12...master\n\n\n\n.. end-badges\n\nConnect weather data interfaces with interfaces of wind and pv power models.\n\n* Free software: MIT license\n\nInstallation\n============\n\nOn Linux systems, you can just::\n\n pip install feedinlib\n\nYou can also install the in-development version with::\n\n pip install https://github.com/oemof/feedinlib/archive/master.zip\n \nOn Windows systems, some dependencies are not pip-installable. Thus, Windows\nusers first have to manually install the dependencies, e.g. using conda or mamba.\n\n\nDocumentation\n=============\n\n\nhttps://feedinlib.readthedocs.io/\n\n\nDevelopment\n===========\n\nTo run all the tests run::\n\n tox\n\nNote, to combine the coverage data from all the tox environments run:\n\n.. 
list-table::\n :widths: 10 90\n :stub-columns: 1\n\n - - Windows\n - ::\n\n set PYTEST_ADDOPTS=--cov-append\n tox\n\n - - Other\n - ::\n\n PYTEST_ADDOPTS=--cov-append tox\n'",,"2014/07/24, 09:03:29",3380,MIT,3,507,"2023/07/21, 12:14:55",23,19,53,2,96,5,0.8,0.6883720930232557,"2019/01/31, 14:19:59",v0.0.12,0,11,false,,false,true,"in-RET/inretensys-fastapi,UU-ER/EHUB-Py_Training,oemof/oemof,moritz-reuter/ESEM-EE,rl-institut/OWEFE,dpinney/wiires,SESMG/SESMG,greco-project/greco_technologies,birgits/dgs_tool,open-fred/feedin_germany,rl-institut/WAM_APP_vdi,rl-institut/WAM_APP_stemp_mv,rl-institut/appBBB",,https://github.com/oemof,https://oemof.org,Germany,,,https://avatars.githubusercontent.com/u/8503379?v=4,,,
photovoltaic,A Python library used in photovoltaics.,pvedu,https://github.com/pvedu/photovoltaic.git,github,,Photovoltaics and Solar Energy,"2022/04/25, 22:48:39",44,2,3,false,HTML,,,"HTML,Python,Batchfile",,"b'# photovoltaic\n\n\n**photovoltaic** is a library of python functions used in photovoltaics. It\'s preferable to install the library, but the functions are simple enough to include directly in your code.\n\nHelp Index: http://htmlpreview.github.io/?https://github.com/pvedu/photovoltaic/blob/master/html/photovoltaic.html \nCode is at: https://github.com/pvedu/photovoltaic/tree/master/photovoltaic \n\n## Examples\n\nThe best place to start is with the examples at:\nhttps://github.com/pvedu/pvon\n\nThere are instructions on how to run the examples completely within the browser and without installing anything.\n\n## Typical usage\n\n import photovoltaic as pv\n irradiance = pv.sun.blackbody_spectrum(800)\n print(irradiance)\n\nThis would print the blackbody irradiance at 800 nm with the default temperature of 6000 K in W/m2/nm.\n\n\n## Installation\n\nInstallation is via pip from the PyPI repository. From a command prompt:\n\n pip install photovoltaic\n\nInside a Jupyter notebook use:\n\n !pip install photovoltaic\n\nSome systems use pip3 instead of pip. People recommend using a virtual environment, but I don\'t find it necessary on MS Windows.\n\n\t\nThe above command should also install the latest scipy and numpy packages. They can also be installed directly with:\n\n pip install numpy\n\n pip install scipy\n\n## Requirements\n\nKnown to work under plain vanilla Python 3.6 using the standard IDLE editor with Numpy and Scipy installed. The examples also make use of matplotlib. 
It should also work with the various Python systems such as Anaconda, Jupyter, etc.\n\n\nAnaconda includes a wealth of scientific packages and is available at: https://www.anaconda.com/download/ \n\nStandard Python is at https://www.python.org/downloads/\n\nFor the graphs, Matplotlib is needed in addition to the above numpy and scipy packages:\n\n pip install matplotlib\n\n\n\n\n\n\n## Other\n\n**f** means **from** in some of the function names. For example:\n\nnmfeV() converts the energy of a photon **from** electron volts to nm.\n\nThis follows the conventions of other python functions such as strftime.\n\n\nThe library is designed to be as simple as possible and an ""algorithm that runs"". While it is easier to install the whole library, it is also straightforward to cut/paste parts of the code.\n\nThere are other python libraries that cover sections of the photovoltaic library in much more detail.\n\n* [pvlib](https://github.com/pvlib/pvlib-python) covers insolation and systems modeling.\n* [Semiconductors](https://github.com/MK8J) relating to solar.\n'",,"2017/08/29, 23:41:59",2247,GPL-3.0,0,56,"2022/04/25, 22:48:40",2,2,2,0,547,1,0.0,0.4285714285714286,,,0,3,false,,false,false,"NREL/PVDegradationTools,pvedu/pvon",,,,,,,,,,
pvcaptest,Collection of functions and Jupyter Notebooks to partially automate running a capacity test following ASTM E2848.,pvcaptest,https://github.com/pvcaptest/pvcaptest.git,github,,Photovoltaics and Solar Energy,"2023/09/05, 01:50:52",17,0,3,true,Python,,pvcaptest,Python,,"b'# pvcaptest\n\n\n\n\n | Latest Release | \n ![]() | \n
\n\n
\n\n# What is pvcaptest?\npvcaptest is an open source python package created to facilitate capacity testing following the ASTM E2848 standard. The captest module contains a single class, CapData, which provides methods for loading, visualizing, filtering, and regressing capacity testing data. The module also includes functions that take CapData objects as arguments and provide summary data and capacity test results.\n\nDocumentation and examples are available on [readthedocs](https://pvcaptest.readthedocs.io/en/latest/), including full examples in jupyter notebooks that can be run in the browser without installing anything.\n\n# Installation\nThese instructions assume that you are new to using conda and python. If that is not the case, skip to the last section for users familiar with conda and pip.\n\nThe recommended method to install pvcaptest is to create a conda environment for pvcaptest. Installing Anaconda or miniconda will install both python and conda. There is no need to install python separately.\n\n**Easiest Option:**\n1. Download and install the [anaconda distribution](https://www.anaconda.com/products/individual). Follow the default installation settings.\n2. On Windows go to the start menu and open the Anaconda prompt under the newly installed Anaconda program. On OSX or Linux open a terminal window.\n3. Install pvcaptest by typing the command `conda install -c conda-forge pvcaptest` and pressing enter. The `-c conda-forge` option tells conda to install pvcaptest from the [conda forge channel](https://conda-forge.org/#about).\n\n\nThis will install the pvcaptest package in the base environment created when Anaconda is installed. This should work and provide you with jupyter notebook and jupyter lab to run pvcaptest in. If you think you will use your Anaconda installation to create and maintain additional environments, the following process for creating a standalone environment is likely a better option.\n\n**Better long term option:**\n1. 
If you do not already have it installed, download and install the [anaconda distribution](https://www.anaconda.com/products/individual) or miniconda.\n2. Go to the [project github page](https://github.com/pvcaptest/pvcaptest) and download the project source to obtain a copy of the `environment.yml` file. Click the green code button and click \'Download ZIP\'.\n3. On Windows go to the start menu and open the Anaconda prompt under the newly installed Anaconda program. On OSX or Linux open a terminal window. Note the path in the prompt for the next step. On Windows this should be something like `C:\\Users\\username\\`.\n4. Unzip and move the `environment.yml` file to the folder identified by the path from the previous step.\n5. In your Anaconda prompt or terminal type `conda env create -f environment.yml` \nand hit enter. Wait for a few seconds while conda works to solve the environment. It should ask you if you want to proceed to install new packages, including pvcaptest. Type `y` and press enter to proceed, then wait for conda to finish installing pvcaptest and the other packages.\n6. Once the installation is complete, conda will print out a command for activating the new environment. Run that command, which should be something like `conda activate captest_env`.\n\nThe environment created will include jupyter lab and notebook for you to use pvcaptest in. 
You can start these using the commands `jupyter lab` or `jupyter notebook`.\n\n\nSee the [conda documentation](https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html#creating-an-environment-from-an-environment-yml-file) for more details on using conda to create and manage environments.\n\n\n### Install for users familiar with conda and pip:\nConda install into an existing environment:\n\n`conda install -c conda-forge pvcaptest`\n\nIf you prefer, you can pip install pvcaptest, but the recommended approach is to use the conda package.\n\n**Note: The conda package is named pvcaptest and the pip package is named captest. The project is moving to consistent use of the pvcaptest name, but the package name on pypi will remain as captest.**\n'",,"2017/09/19, 01:33:23",2227,MIT,213,881,"2023/09/05, 01:53:52",22,35,76,31,50,0,0.3,0.015706806282722474,"2023/09/10, 18:59:38",v0.10.0,0,3,false,,false,false,,,https://github.com/pvcaptest,,,,,https://avatars.githubusercontent.com/u/53880406?v=4,,,
pvtrace,Optical ray tracing for luminescent materials and spectral converter photovoltaic devices.,danieljfarrell,https://github.com/danieljfarrell/pvtrace.git,github,"python,photovoltaics,raytracing,optics,energy",Photovoltaics and Solar Energy,"2021/03/30, 17:10:12",91,2,8,true,Python,,,Python,,"b'[](https://doi.org/10.5281/zenodo.592982)\n\n\n\n> Optical ray tracing for luminescent materials and spectral converter photovoltaic devices\n\n# Ray-tracing luminescent solar concentrators\n\n*pvtrace* is a statistical photon path tracer written in Python. Rays are followed through a 3D scene and their interactions with objects are recorded to build up statistical information about energy flow.\n\nThis is useful in photovoltaics and non-imaging optics where the goal is to design systems which efficiently transport light to target locations. \n\nOne of its key features is the ability to simulate re-absorption in luminescent materials, for example in devices like Luminescent Solar Concentrators (LSCs).\n\nA basic LSC can be simulated and visualised in five lines of code,\n\n```python\nfrom pvtrace import *\nlsc = LSC((5.0, 5.0, 1.0)) # size in cm\nlsc.show() # open visualiser\nlsc.simulate(100) # emit 100 rays\nlsc.report() # print report\n```\n\nThis script will render the ray-tracing in real time,\n\n\n\npvtrace has been validated against three other luminescent concentrator codes. For full details, see the [Validation.ipynb](https://github.com/danieljfarrell/pvtrace/blob/master/examples/Validation.ipynb) notebook.\n\n\n\n# Install\n\n## MacOS using pyenv\n\nOn MacOS *pvtrace* can be installed easily using [pyenv](https://github.com/pyenv/pyenv), the `pip` command and [homebrew](https://brew.sh). 
First install [homebrew](https://brew.sh), then install `spatialindex` for the RTree dependency,\n\n brew install spatialindex\n\nNext, create a clean virtual environment for pvtrace\n\n pyenv install 3.7.8\n pyenv virtualenv 3.7.8 pvtrace-env\n pyenv activate pvtrace-env\n pip install pvtrace\n\n## Linux and Windows using Conda\n\nOn Linux and Windows you must use conda to create the python environment. You can also use this method on MacOS if you prefer conda over pyenv.\n\n conda create --name pvtrace-env python=3.7.8\n conda activate pvtrace-env\n conda install Rtree\n pip install pvtrace\n\n# Run the example script and notebooks\n\nDownload the [hello_world.py](https://raw.githubusercontent.com/danieljfarrell/pvtrace/master/examples/hello_world.py) example script either manually or using `curl`,\n\n # Download example script\n curl https://raw.githubusercontent.com/danieljfarrell/pvtrace/master/examples/hello_world.py > hello_world.py\n\nNow activate your python environment! \n\nIf you installed using **pyenv** do the following,\n\n pyenv local pvtrace-env\n\nIf you are using **conda**, do this,\n\n conda activate pvtrace-env\n\nNow start the meshcat server with the command,\n\n meshcat-server\n\nThis will print information like,\n\n zmq_url=tcp://127.0.0.1:6000\n web_url=http://127.0.0.1:7000/static/\n\nOpen a new terminal window and again activate your pvtrace-env.\n\nOpen `hello_world.py` and make sure the line below has the `zmq_url` of your meshcat-server,\n\n # Change zmq_url here to be the address of your meshcat-server!\n renderer = MeshcatRenderer(\n zmq_url=""tcp://127.0.0.1:6000"", wireframe=True, open_browser=True\n ) \n\nYou can now run pvtrace scripts! Run the following command,\n\n python hello_world.py\n\nAlso take a look at the online Jupyter notebook tutorial series which provides an overview of pvtrace and examples,\n\n 1. 
[Quick Start.ipynb](https://github.com/danieljfarrell/pvtrace/blob/master/examples/001%20Quick%20Start.ipynb), an interactive ray-tracing tutorial (download and run locally)\n 2. [Materials.ipynb](https://github.com/danieljfarrell/pvtrace/blob/master/examples/002%20Materials.ipynb), include physical properties with materials\n 3. [Lights.ipynb](https://github.com/danieljfarrell/pvtrace/blob/master/examples/003%20Lights.ipynb), place photon sources in the scene and customise their properties\n 4. [Nodes.ipynb](https://github.com/danieljfarrell/pvtrace/blob/master/examples/004%20Nodes.ipynb), translate and rotate scene objects with nodes\n 5. [Geometry.ipynb](https://github.com/danieljfarrell/pvtrace/blob/master/examples/005%20Geometry.ipynb), define the shapes of objects in your scene\n 6. [Coatings.ipynb](https://github.com/danieljfarrell/pvtrace/blob/master/examples/006%20Coatings.ipynb), introduce custom reflections with coatings\n\nDownload and run these notebooks locally for a more interactive experience, but first install jupyter,\n\n pip install jupyter\n\nor with conda,\n\n conda install jupyter\n\nThen launch the jupyter notebook,\n\n jupyter notebook\n\n# Features\n\n## Ray optics simulations\n\n*pvtrace* supports 3D ray optics simulations with the following shapes,\n\n* box\n* sphere\n* cylinder\n* mesh\n\nThe optical properties of each shape can be customised,\n\n* refractive index\n* absorption coefficient\n* scattering coefficient\n* emission lineshape\n* quantum yield\n* surface reflection\n* surface scattering\n\n\n\n## High and low-level API\n\n*pvtrace* has a high-level API for handling common problems with LSCs and a low-level API where objects can be positioned in a 3D scene and optical properties customised.\n\nFor example, a script using the low-level API to ray trace this glass sphere is below,\n\n```python\nimport time\nimport sys\nimport functools\nimport numpy as np\nfrom pvtrace import *\n\n# World node contains all objects\nworld = Node(\n name=""world 
(air)"",\n geometry=Sphere(\n radius=10.0,\n material=Material(refractive_index=1.0),\n )\n)\n\n# The glass sphere\nsphere = Node(\n name=""sphere (glass)"",\n geometry=Sphere(\n radius=1.0,\n material=Material(refractive_index=1.5),\n ),\n parent=world\n)\nsphere.location = (0, 0, 2)\n\n# The source of rays\nlight = Node(\n name=""Light (555nm)"",\n light=Light(direction=functools.partial(cone, np.pi/8)),\n parent=world\n)\n\n# Render and ray-trace\nrenderer = MeshcatRenderer(wireframe=True, open_browser=True)\nscene = Scene(world)\nrenderer.render(scene)\nfor ray in scene.emit(100):\n steps = photon_tracer.follow(scene, ray)\n path, events = zip(*steps)\n renderer.add_ray_path(path)\n time.sleep(0.1)\n\n# Wait for Ctrl-C to terminate the script; keep the window open\nprint(""Ctrl-C to close"")\nwhile True:\n try:\n time.sleep(.3)\n except KeyboardInterrupt:\n sys.exit()\n```\n\n## Scene Graph\n\n*pvtrace* is designed in layers each with as limited scope as possible.\n\n\n\n\n - Scene
\n - Graph data structure of nodes; the thing that is ray-traced.
\n \n - Node
\n - Provides a coordinate system; nodes can be nested inside one another and perform arbitrary rotation and translation transformations.
\n \n - Geometry
\n - Attached to nodes to define different shapes (Sphere, Box, Cylinder, Mesh) and handles all ray intersections.
\n \n - Material
\n - Attached to geometry objects to assign physical properties to shapes such as refractive index.
\n \n - Surface
\n - Handles the details of interactions at material surfaces and provides a customisation point for simulating wavelength-selective coatings.
\n \n - Components
\n - Specifies optical properties of the geometry\'s volume: absorption coefficient, scattering coefficient, quantum yield, emission spectrum.
\n \n - Ray-tracing engine
\n - The algorithm which spawns rays, computes intersections, samples probabilities and traverses the rays through the scene.
\n
\n\n## Ray-tracing engine\n\nCurrently *pvtrace* supports only one ray-tracing engine: a photon path tracer. This is physically accurate, down to treating individual absorption and emission events, but is slow because the problem cannot be vectorised as each ray is followed individually.\n\n# Documentation\n\nInteractive Jupyter notebooks are in the [examples directory](https://github.com/danieljfarrell/pvtrace/tree/master/examples), download and take a look, although they can be viewed online.\n\n# Contributing\n\nPlease use the github [issue](https://github.com/danieljfarrell/pvtrace/issues) tracker for bug fixes, suggestions, or support questions.\n\nIf you are considering contributing to pvtrace, first fork the project. This will make it easier to include your contributions using pull requests.\n\n## Creating a development environment\n\n1. First create a new development environment using [MacOS instructions](#macos-using-pyenv) or [Linux and Windows instructions](#linux-and-windows-using-conda), but do not install pvtrace using pip! You will need to clone your own copy of the source code in the following steps.\n2. Use the GitHub fork button to make your own fork of the project. This will make it easy to include your changes in pvtrace using a pull request.\n3. Follow the steps below to clone and install the development dependencies:\n\n```bash\n# Pull from your fork\ngit clone https://github.com//pvtrace.git\n\n# Get development dependencies\npip install -r pvtrace/requirements_dev.txt \n\n# Add local `pvtrace` directory to known packages\npip install -e pvtrace\n\n# Run unit tests\npytest pvtrace/tests\n\n# Run an example\npython pvtrace/examples/hello_world.py\n```\n\nYou should now be able to edit the source code and simply run scripts directly without the need to reinstall anything.\n\n## Unit tests\n\nPlease add new unit tests, or modify existing ones, in the `pvtrace/tests` directory if you are adding new code. 
This will make it much easier to include your changes in the project.\n\n## Pull requests\n\nPull requests will be considered. Please make contact before doing a lot of work, to make sure that the changes will definitely be included in the main project.\n\n# Questions\n\nYou can get in contact with me directly at dan@excitonlabs.com or raise an issue on the issue tracker.\n\n# Dependencies\n\nBasic environment requires the following packages which will be installed with `pip` automatically\n\n* python >= 3.7.2\n* numpy\n* pandas\n* trimesh[easy]\n* meshcat >= 0.0.16\n* anytree\n'",",https://doi.org/10.5281/zenodo.592982","2011/01/05, 14:28:41",4676,CUSTOM,0,185,"2023/05/31, 10:41:40",8,25,48,4,147,0,0.5,0.025316455696202556,"2020/11/18, 23:04:23",2.1.6,0,5,false,,false,false,"dcambie/LSC-PM_solar_miniplant,danieljfarrell/pvtrace",,,,,,,,,,
SolarPILOT,Solar power tower layout and optimization tool.,NREL,https://github.com/NREL/SolarPILOT.git,github,,Photovoltaics and Solar Energy,"2023/06/28, 21:42:49",37,0,9,true,C++,National Renewable Energy Laboratory,NREL,"C++,HTML,Python,Rich Text Format,CSS,C,CMake,Inno Setup,JavaScript,TeX,Makefile,Shell",https://www.nrel.gov/csp/solarpilot.html,"b'# SolarPILOT - Solar Power tower Integrated Layout and Optimization Tool\n\n[](https://travis-ci.org/NREL/SolarPILOT)\n\nThe SolarPILOT Open Source Project repository contains the source code, tools, and instructions to build a desktop version of the National Renewable Energy Laboratory\'s SolarPILOT. SolarPILOT is a design, characterization, and optimization tool for concentrating solar power (CSP) tower plants. It is available through this repository as a standalone application with full functionality, and it is also included in several CSP tower models within NREL\'s System Advisor Model (SAM) in limited form. For more details about SolarPILOT\'s capabilities, see the SolarPILOT website at [https://www.nrel.gov/csp/solarpilot.html](https://www.nrel.gov/csp/solarpilot.html). For details on integration with SAM, see the SAM website at [sam.nrel.gov](https://sam.nrel.gov).\n\nThe desktop version of SolarPILOT for Windows or Linux builds from the following open source projects:\n\n* [SolTrace](https://github.com/nrel/soltrace) is a general tool for Monte-Carlo ray tracing that allows optical characterization of a wide range of possible systems. The tool is used by SolarPILOT alongside the analytical Hermite polynomial expansion model to ensure model accuracy.\n\n* [SSC](https://github.com/mjwagner2/ssc/tree/solarpilot-develop) is a set of ""compute modules"" that simulate different kinds of power systems and financial structures. It can be run directly using the [SAM Software Development Kit](https://sam.nrel.gov/sdk). **If you are looking for the algorithms underlying the models, they are located in this repository. 
For a list of SSC release versions that correspond with SolarPILOT GUI releases, see the release list and tags at [Mike Wagner\'s personal GitHub page](https://github.com/mjwagner2/ssc/releases).**\n\n* [LK](https://github.com/nrel/lk) is a scripting language that is integrated into SAM and allows users to add functionality to the program.\n\n* [wxWidgets](https://www.wxwidgets.org/) is a cross-platform graphical user interface platform used for SAM\'s user interface, and for the development tools included with SSC (SDKtool) and LK (LKscript). The current version of SAM uses wxWidgets 3.1.1.\n\n* [WEX](https://github.com/nrel/wex) is a set of extensions to wxWidgets for custom user-interface elements used by SAM, and by LKscript and DView, which are integrated into SAM.\n\n* [Google Test](https://github.com/google/googletest) is a C++ test framework that enables comprehensive unit-testing of software. Contributions to the project will eventually be required to have associated unit tests written in this framework.\n\n* This repository, **SolarPILOT**, provides the user interface to assign values to inputs of the computational modules, run the modules in the correct order, and display calculation results. It also includes tools for editing LK scripts, viewing field layout and receiver flux map data, and performing multi-dimensional system optimization.\n\n## Quick Steps for Building SolarPILOT\n\nFor detailed build instructions see the [wiki](https://github.com/NREL/SolarPILOT/wiki), with specific instructions for:\n\n* [Windows](https://github.com/NREL/SolarPILOT/wiki/build-windows)\n* [Linux](https://github.com/NREL/SolarPILOT/wiki/build-linux)\n\nThese are the general quick steps you need to follow to set up your computer for developing SolarPILOT:\n\n1. 
Set up your development tools:\n\n * Windows: Visual Studio 2017 Community or other editions available [here](https://www.visualstudio.com/).\n * Linux: g++ compiler available [here](http://www.cprogramming.com/g++.html) or as part of the Linux distribution.\n\n2. Download the [wxWidgets 3.1.1 source code](https://www.wxwidgets.org/downloads/) for your operating system.\n\n3. Build wxWidgets.\n\n4. On Windows, create the WXMSW3 environment variable on your computer to point to the wxWidgets installation folder; on Linux, create the symbolic link `/usr/local/bin/wx-config-3` pointing to `/path/to/wxWidgets/bin/wx-config`.\n\n5. As you did for wxWidgets, for each of the following projects, clone (download) the repository, build the project, and then (Windows only) create an environment variable pointing to the project folder. Build the projects in the following order, and assign the environment variable for each project before you build the next one:\n\n\n| Project | Repository URL | Windows Environment Variable |\n| --- | --- | --- |
\n| wxWidgets | https://www.wxwidgets.org/downloads | WXMSW3 |
\n| LK | https://github.com/NREL/lk | LKDIR |
\n| WEX | https://github.com/NREL/wex | WEXDIR |
\n| Google Test | https://github.com/google/googletest | GTEST |
\n| SSC | https://github.com/mjwagner2/ssc, -b solarpilot-develop | SSCDIR |
\n| SolTrace | https://github.com/NREL/SolTrace | CORETRACEDIR |
\n| SolarPILOT | https://github.com/NREL/SolarPILOT | |
\n
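The field characterization that SolarPILOT performs includes per-heliostat optical losses. As a toy illustration only (not SolarPILOT code; the function and the simple flat-terrain vector model are assumptions made for this sketch), the cosine efficiency of a heliostat, whose mirror normal must bisect the sun direction and the heliostat-to-receiver direction, can be computed as:

```python
import math

def unit(v):
    """Normalise a 3-vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cosine_efficiency(sun_dir, heliostat_pos, receiver_pos):
    """Cosine efficiency of one heliostat. The mirror normal bisects
    the unit sun vector s and the unit heliostat-to-receiver vector t,
    so the incidence angle is half the angle between them and the
    efficiency is cos(theta) = sqrt((1 + s.t) / 2)."""
    s = unit(sun_dir)
    t = unit(tuple(r - h for r, h in zip(receiver_pos, heliostat_pos)))
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(s, t))))
    return math.sqrt((1.0 + dot) / 2.0)
```

A heliostat 100 m from the base of a 100 m tower under an overhead sun reflects through 45 degrees, giving an incidence angle of 22.5 degrees and a cosine efficiency of about 0.924.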
\n\n## Contributing\n\nIf you would like to report an issue with SolarPILOT or make a feature request, please let us know by adding a new issue on the [issues page](https://github.com/NREL/SolarPILOT/issues).\n\nIf you would like to submit code to fix an issue or add a feature, you can use GitHub to do so. Please see [Contributing](CONTRIBUTING.md) for instructions.\n\n## License\n\nSolarPILOT\'s open source code is copyrighted by the Alliance for Sustainable Energy and licensed under a [mixed MIT and GPLv3 license](LICENSE.md). It allows for-profit and not-for-profit organizations to develop and redistribute software based on SolarPILOT under terms of an MIT license and requires that research entities including national laboratories, colleges and universities, and non-profit organizations make the source code of any redistribution publicly available under terms of a GPLv3 license.\n\n## Citing SolarPILOT\n\nIf you find SolarPILOT useful, we ask that you appropriately cite it in documentation of your work. We provide the open-source code and executable distributions for free, but find value in being acknowledged in work that advances scientific knowledge and engineering technology. For general use of SolarPILOT, the preferred citation is:\n\n> Wagner, M.J., Wendelin, T. (2018). ""SolarPILOT: A power tower solar field layout and characterization tool"", _Solar Energy_, Vol. 171, pp. 185-196, ISSN 0038-092X, [https://doi.org/10.1016/j.solener.2018.06.063](https://doi.org/10.1016/j.solener.2018.06.063).\n\nThe work is also presented in the following publication:\n\n> Wagner, M.J., Braun, R.J., Newman, A.M. (2017). ""Optimization of stored energy dispatch for concentrating solar power systems."" Doctoral Thesis. Colorado School of Mines, Golden, Colorado. Chapter II, pp. 19-45. 
URL: [https://dspace.library.colostate.edu/handle/11124/171000](https://dspace.library.colostate.edu/handle/11124/171000).\n\nFor work that builds substantially upon or is derived from the open source project, the preferred citation is:\n\n> Wagner, M.J. (2018). ""SolarPILOT Open-Source Software Project: [github.com/NREL/SolarPILOT](https://github.com/NREL/SolarPILOT)."" Accessed _(dd/mm/yyyy)_. National Renewable Energy Laboratory, Golden, Colorado.\n'",",https://doi.org/10.1016/j.solener.2018.06.063,https://doi.org/10.1016/j.solener.2018.06.063","2017/03/24, 18:22:25",2406,CUSTOM,11,307,"2022/08/08, 19:28:42",16,39,68,0,443,3,0.0,0.07749077490774903,"2019/01/03, 19:40:08",v1.3.2,0,2,false,,false,true,,,https://github.com/NREL,http://www.nrel.gov,"Golden, CO",,,https://avatars.githubusercontent.com/u/1906800?v=4,,,
solar-data-tools,Data analysis tools for working with historical PV solar time-series data sets.,slacgismo,https://github.com/slacgismo/solar-data-tools.git,github,,Photovoltaics and Solar Energy,"2023/09/27, 00:01:47",39,11,12,true,Jupyter Notebook,SLAC GISMo,slacgismo,"Jupyter Notebook,Python,Shell",,"b'# solar-data-tools\n
\n\nTools for performing common tasks on solar PV data signals. These tasks include finding clear days in\na data set, common data transforms, and fixing time stamp issues. These tools are designed to be\nautomatic and require little if any input from the user. Libraries are included to help with data IO\nand plotting as well.\n\nThere is close integration between this repository and the [Statistical Clear Sky](https://github.com/slacgismo/StatisticalClearSky) repository, which provides a ""clear sky model"" of system output, given only measured power as an input.\n\nSee the [notebooks](/notebooks) folder for examples.\n\n## Install & Setup\n\n### Three ways of setting up; any one works:\n\n#### 1) Recommended: Set up `conda` environment with provided `.yml` file\n\nWe recommend setting up a fresh Python virtual environment in which to use `solar-data-tools`, using the [Conda](https://docs.conda.io/projects/conda/en/latest/index.html) package management system and creating an environment with the configuration file named `pvi-user.yml`, provided in the top level of this repository. 
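One of the tasks listed above, finding clear days, can be illustrated with a toy smoothness heuristic. This is not the algorithm solar-data-tools actually uses (its pipeline is based on signal decomposition); the normalised-variation measure and the threshold below are assumptions made for this sketch:

```python
def normalized_variation(power):
    """Total variation of one day's power profile divided by its peak.
    A smooth clear-sky bell rises to the peak and falls back once, so
    its normalised variation is close to the minimum possible value of
    2.0; cloudy days jitter well above that."""
    peak = max(power)
    if peak <= 0:
        return float("inf")  # no production: treat as not clear
    return sum(abs(b - a) for a, b in zip(power, power[1:])) / peak

def is_clear_day(power, threshold=2.5):
    """Classify a day as clear when its profile is smooth enough."""
    return normalized_variation(power) < threshold
```

A smooth ramp up to a single peak and back down classifies as clear, while the same peak interrupted by cloud dips does not.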
This will install the `statistical-clear-sky` package as well.\n\nCreating the env:\n\n```bash\n$ conda env create -f pvi-user.yml\n```\n\nStarting the env:\n\n```bash\n$ conda activate pvi_user\n```\n\nStopping the env:\n\n```bash\n$ conda deactivate\n```\n\nUpdating the env with the latest:\n\n```bash\n$ conda env update -f pvi-user.yml\n```\n\nAdditional documentation on setting up the Conda environment is available [here](https://github.com/slacgismo/pvinsight-onboarding/blob/main/README.md).\n\n\n#### 2) PIP Package\n\n```sh\n$ pip install solar-data-tools\n```\n\nAlternative: clone the repo from GitHub and mimic the pip package by installing locally:\n\n```bash\n$ pip install -e path/to/root/folder\n```\n\n#### 3) Anaconda Package\n\n```sh\n$ conda install -c slacgismo solar-data-tools\n```\n\n### Solvers\n\n#### ECOS\n\nBy default, the open-source ECOS solver is used, which is supported by cvxpy.\nHowever, the MOSEK solver has proven more stable, so we encourage you to install it separately and obtain a license on your own, as described below.\n\n#### MOSEK\n\nMOSEK is a commercial software package. The included YAML file will install MOSEK for you, but you will still need to obtain a license. 
More information is available here:\n\n* [mosek](https://www.mosek.com/resources/getting-started/)\n* [Free 30-day trial](https://www.mosek.com/products/trial/)\n* [Personal academic license](https://www.mosek.com/products/academic-licenses/)\n\n## Usage\n\nUsers will primarily interact with this software through the `DataHandler` class.\n\n```python\nfrom solardatatools import DataHandler\nfrom solardatatools.dataio import get_pvdaq_data\n\npv_system_data = get_pvdaq_data(sysid=35, api_key=\'DEMO_KEY\', year=[2011, 2012, 2013])\n\ndh = DataHandler(pv_system_data)\ndh.run_pipeline(power_col=\'dc_power\')\n```\nIf everything is working correctly, you should see something like the following:\n\n```\ntotal time: 16.67 seconds\n--------------------------------\nBreakdown\n--------------------------------\nPreprocessing 6.52s\nCleaning 8.62s\nFiltering/Summarizing 1.53s\n Data quality 0.23s\n Clear day detect 0.19s\n Clipping detect 0.21s\n Capacity change detect 0.91s\n```\n\n## Contributors\n\nYou must enable the pre-commit hook before pushing any contributions:\n```\npip install pre-commit\npre-commit install\n```\n\nRun the pre-commit hook on all files:\n```\npre-commit run --all-files\n```\n\n## Test Coverage\n\nTo view the current test coverage metrics, run:\n```\ncoverage run --source solardatatools -m unittest discover && coverage html\nopen htmlcov/index.html\n```\n\n## Versioning\n\nWe use [Semantic Versioning](http://semver.org/) for versioning. 
For the versions available, see the [tags on this repository](https://github.com/slacgismo/solar-data-tools/tags).\n\n## Authors\n\n* **Bennet Meyers** - *Initial work and Main research work* - [Bennet Meyers GitHub](https://github.com/bmeyers)\n\nSee also the list of [contributors](https://github.com/bmeyers/solar-data-tools/contributors) who participated in this project.\n\n## License\n\nThis project is licensed under the BSD 2-Clause License - see the [LICENSE](LICENSE) file for details\n'",",https://zenodo.org/badge/latestdoi/171066536","2019/02/17, 00:28:25",1711,BSD-2-Clause,269,1080,"2023/09/26, 19:03:30",7,86,98,19,29,5,0.9,0.45444685466377444,"2023/09/27, 00:04:04",v1.0.1,0,9,false,,false,true,"MichaelHopwood/ForwardForwardOneclass,slaclab/neural-representation-sqw,slacgismo/sg2t,ChrisKre/Photovoltaik_GAN,slacgismo/pv-apache-beam,slacgismo/gismo-cloud-deploy,slacgismo/pv-system-profiler,DuraMAT/pvpro,slacgismo/solar-data-pipeline,slacgismo/StatisticalClearSky,tadatoshi/StatisticalClearSky",,https://github.com/slacgismo,https://gismo.slac.stanford.edu/,"SLAC National Accelerator Laboratory, Menlo Park, CA 94025",,,https://avatars.githubusercontent.com/u/19895500?v=4,,,
SolarPV-DER-simulation-utility,Allows user to run dynamics simulations for solar photovoltaic distributed energy resource connected to a stiff voltage source or to an external program.,tdcosim,https://github.com/tdcosim/SolarPV-DER-simulation-tool.git,github,,Photovoltaics and Solar Energy,"2023/09/29, 01:15:58",31,2,6,true,Python,TDcoSim Team,tdcosim,"Python,Dockerfile",,"b'**Status:** Expect regular updates and bug fixes.\n\n# Tool for simulating dynamics of PV-DER\n[](https://travis-ci.org/sibyjackgrove/SolarPV-DER-simulation-utility)\n\n[](https://www.codefactor.io/repository/github/tdcosim/solarpv-der-simulation-utility)\n\nSolar photovoltaic distributed energy resources (PV-DER) are power electronic inverter-based generation (IBG) connected to the electric power distribution system (e.g. rooftop solar PV systems). This tool can be used to simulate the dynamics of a single DER connected to a stiff voltage source, as shown in the following schematic:\n\n\n\n## Basics\nThe dynamics of the DER are modelled using dynamic phasors. A detailed description of the concepts behind this tool can be found in the IEEE publication [Dynamic Modeling of Solar PV Systems for Distribution System Stability Analysis](https://www.researchgate.net/publication/333985171_Dynamic_Modeling_of_Solar_PV_Systems_for_Distribution_System_Stability_Analysis), and a detailed list of equations can be found in the [Model specification document](docs/PV_DER_model_specification_rev3.docx).\n\n### Features\nThe following features are currently available:\n1. Single phase, three phase balanced, and three phase unbalanced (phase voltages may be unbalanced) DER models.\n2. Run simulations in stand-alone mode with an internal grid voltage source (stiff) model.\n3. Run simulations in loop mode, where the grid voltage is supplied at every time step by a third-party program.\n4. Customize all aspects of the model through a [JSON](config_der.json) file, which provides access to parameters in all the model components. \n5. 
Visualize or retrieve simulation results for voltages, currents, and active and reactive power.\n6. Introduce solar insolation events (in all modes), and grid voltage and frequency change events (in stand-alone mode).\n7. Retrieve and modify model parameters from a third-party program.\n8. The following smart inverter features are available: Low/High voltage ride through (LVRT/HVRT), Low frequency ride through (LFRT), and Volt-VAR control logic.\n\n## Links\n* Source code repository: https://github.com/sibyjackgrove/SolarPV-DER-simulation-tool\n* API Documentation: https://solarpv-der-simulation-utility.readthedocs.io/en/latest/\n* Additional documentation: [Description of attributes and methods](docs/PVDER_flags_variables_methods.md)\n\n## Installation\n\nDependencies:\n\n- SciPy >= 1.2.1\n- NumPy >= 1.16.2\n- Matplotlib >= 3.0.3\n\nInstall the latest release:\n```\npip install pvder\n```\n\nInstall from source:\n```\ngit clone https://github.com/tdcosim/SolarPV-DER-simulation-tool.git\ncd SolarPV-DER-simulation-tool\npip install -e .\n```\n\n## Use cases\nThe following projects use the Solar PV-DER simulation tool:\n1. [Argonne Transmission and Distribution systems Co-Simulation tool (TDcoSim)](https://github.com/tdcosim/TDcoSim)\n2. [OpenAI Gym Distributed Energy Resource Environment (Gym-DER)](https://github.com/sibyjackgrove/gym-SolarPVDER-environment)\n\n## Using the tool\nThis tool can be imported as a normal Python module:\n\n```python\nimport pvder\n```\n\n### Using the stand-alone single-phase DER model with a 10 kW power rating\nThe following steps are required. Additional documentation on attributes and methods is available [here](docs/PVDER_flags_variables_methods.md).\n1. 
First import the following classes:\n```\nfrom pvder.DER_components_single_phase import SolarPV_DER_SinglePhase\nfrom pvder.grid_components import Grid\nfrom pvder.dynamic_simulation import DynamicSimulation\nfrom pvder.simulation_events import SimulationEvents\nfrom pvder.simulation_utilities import SimulationResults\n```\n2. Create a **_SimulationEvents_** object: This object is used to add or remove disturbance events that occur during the simulation.\n```\nevents = SimulationEvents()\n```\n3. Create a **Grid** object: This object describes the steady-state model for the grid voltage source. It needs to be supplied with a **_SimulationEvents_** object.\n```\ngrid = Grid(events=events)\n```\n4. Create a **SolarPV_DER_SinglePhase** or **SolarPV_DER_ThreePhase** object: This object describes the dynamic DER model. It needs a **_SimulationEvents_** object and a path to the JSON file containing the DER configuration parameters. It also needs a **Grid** object (in stand-alone mode). Additionally, either the power rating of the DER or the id of the parameter dictionary should be provided.\n```\nPV_DER = SolarPV_DER_SinglePhase(events=events,configFile=r\'config_der.json\',gridModel=grid,derId= \'10\',standAlone = True)\n```\n5. Create a **DynamicSimulation** object: This object runs the simulation and stores the solution. It takes **_SimulationEvents_**, **Grid**, and **SolarPV_DER_SinglePhase** objects as arguments.\n```\nsim = DynamicSimulation(grid_model=grid,PV_model=PV_DER,events = events)\n```\n6. Create a **SimulationResults** object: This object is used to visualize the simulation results.\n```\nresults = SimulationResults(simulation = sim)\n```\n7. Add an event (e.g. a solar insolation change at 10.0 s):\n```\nevents.add_solar_event(10,90)\n```\n8. Specify simulation flags (e.g. set the DEBUG_SIMULATION and DEBUG_POWER flags to true to observe the power at each time step):\n```\nsim.DEBUG_SIMULATION = False\nsim.DEBUG_POWER = False\n```\n9. Specify the simulation stop time (e.g. 20.0 s):\n```\nsim.tStop = 20.0\n```\n10. Run the simulation:\n```\nsim.run_simulation()\n```\n11. Visualize the results (e.g. the power output at the PCC-LV side):\n```\nresults.PER_UNIT = False\nresults.plot_DER_simulation(plot_type=\'active_power_Ppv_Pac_PCC\')\n```\n\n### Examples\nTry out the Jupyter notebooks with usage examples in Google Colab:\n\nBasic usage:\n[](https://colab.research.google.com/github/sibyjackgrove/SolarPV-DER-simulation-tool/blob/master/examples/PV-DER_usage_example.ipynb)\n\nRunning a simulation in loop mode:\n[](https://github.com/sibyjackgrove/SolarPV-DER-simulation-tool/blob/master/examples/PV-DER_usage_example_loop_mode.ipynb)\n\nUpdating model parameters:\n[](https://github.com/sibyjackgrove/SolarPV-DER-simulation-tool/blob/master/examples/PV-DER_parameter_update_example.ipynb)\n\nVoltage anomaly, ride through, and momentary cessation:\n[](https://github.com/sibyjackgrove/SolarPV-DER-simulation-tool/blob/master/examples/PV-DER_usage_example_LVRT_momentary_cessation_with_recovery.ipynb)\n\nFrequency anomaly, ride through, and trip:\n[](https://github.com/sibyjackgrove/SolarPV-DER-simulation-tool/blob/master/examples/PV-DER_usage_example_LFRT_with_trip.ipynb)\n\n## Module details\nA schematic of the relationship between different classes in the module is shown in the figure below:\n\n\n## Issues\nPlease feel free to raise an issue for bugs or feature requests.\n\n## Who is responsible?\n\n**Core developer:**\n- Siby Jose Plathottam splathottam@anl.gov\n\n**Contributor:**\n\n- Karthikeyan Balasubramaniam kbalasubramaniam@anl.gov\n\n## Acknowledgement\n\nThis project was supported by Kemal Celik, [U.S. 
DOE Office of Electricity, Solar Energy Technology Office](https://www.energy.gov/eere/solar/solar-energy-technologies-office) through the [SuNLaMP](https://www.energy.gov/eere/solar/sunshot-national-laboratory-multiyear-partnership-sunlamp) program.\n\nThe authors would like to acknowledge [Shrirang Abhyankar](https://github.com/abhyshr) and Puspal Hazra for their contribution.\n\n## Citation\nIf you use this code, please cite it as:\n```\n@misc{pvder,\n title = {{SolarPV-DER-simulation-tool}: A simulation tool for solar photovoltaic distributed energy resources},\n author = ""{Siby Jose Plathottam, Karthikeyan Balasubramaniam}"",\n howpublished = {\\url{https://github.com/sibyjackgrove/SolarPV-DER-simulation-tool}},\n url = ""https://github.com/sibyjackgrove/SolarPV-DER-simulation-tool"",\n year = 2019,\n note = ""[Online; accessed 19-March-2019]""\n}\n```\n### Copyright and License\nCopyright \xc2\xa9 2019, UChicago Argonne, LLC\n\nThe Photovoltaic Distributed Energy Resource (PV-DER) Simulation tool is distributed under the terms of the [BSD-3 OSS License](LICENSE).\n'",,"2019/03/19, 21:59:40",1680,CUSTOM,9,225,"2023/09/29, 01:15:59",2,21,35,3,26,0,0.0,0.013245033112582738,,,0,2,false,,false,false,"NREL/PyDSS,tdcosim/TDcoSim",,https://github.com/tdcosim,,Chicago,,,https://avatars.githubusercontent.com/u/52003368?v=4,,,
bifacial_radiance,Toolkit for working with RADIANCE for the ray-trace modeling of Bifacial Photovoltaics.,NREL,https://github.com/NREL/bifacial_radiance.git,github,"radiance,bifacial,photovoltaics,renewable-energy,renewables,gui",Photovoltaics and Solar Energy,"2023/07/18, 15:01:55",70,0,11,true,Python,National Renewable Energy Laboratory,NREL,"Python,C++,TeX,Dockerfile,C,Shell,Makefile,Awk",https://bifacial-radiance.readthedocs.io,"b""\n\n# bifacial_radiance\nMain branch: [](https://github.com/nrel/bifacial_radiance/actions)\n[](https://coveralls.io/github/NREL/bifacial_radiance?branch=main)\n[](https://bifacial-radiance.readthedocs.io/en/latest/?badge=stable)\n[](https://doi.org/10.5281/zenodo.3860350)\n[](https://joss.theoj.org/papers/b018890e2ab7ddf723d37b17e308e273)\n\nDevelopment branch: [](https://github.com/nrel/bifacial_radiance/actions)\n[](https://coveralls.io/github/NREL/bifacial_radiance?branch=development)\n[](https://bifacial-radiance.readthedocs.io/en/latest/?badge=latest)\n\n## Introduction\n\nbifacial_radiance contains a series of Python wrapper functions to make working with \nRADIANCE easier, particularly for the PV researcher interested in bifacial PV \nperformance. For more information, check out our [documentation](https://bifacial-radiance.readthedocs.io), \n[Tutorials in the form of Jupyter Notebooks](docs/tutorials/), or refer to our [Wiki](https://github.com/NREL/bifacial_radiance/wiki)\nand [Issues](https://github.com/NREL/bifacial_radiance/issues) page. \n\n## Installation\n\nhttps://youtu.be/4A9GocfHKyM\nThis video shows how to install the bifacial_radiance software and all associated software needed. More info is available on the Wiki. Instructions are also shown below.\n\nFor detailed instructions on how to install bifacial_radiance, you can also refer to the [installation guide](https://bifacial-radiance.readthedocs.io/en/stable/installation.html).\n\n## GUI! \n\nA GUI has been added in version 3.0. 
The GUI reads/writes all input parameters necessary to run a simulation, and runs the specified simulation by calling the correct functions with the specified parameters. So there is no need to use a journal or a script! But you still need to install it following the procedure below. \n\nTo run the GUI, import bifacial_radiance and run `bifacial_radiance.gui()`\n\n\n\n\n## Usage\n\nWe have a tutorial video showing how the program is structured and how to use the Jupyter tutorials and the GUI. You can watch it here: [Tutorial Webinar](https://www.youtube.com/watch?v=1X9L-R-RVGA), with the [slides available here](https://www.nrel.gov/docs/fy20osti/75218.pdf).\n\nCheck out the [Jupyter Tutorial Notebooks](docs/tutorials/) to see detailed examples of the capabilities of bifacial_radiance.\nThe [Intro examples](https://bifacial-radiance.readthedocs.io/en/stable/introexamples.html) and the [readthedocs documentation](https://bifacial-radiance.readthedocs.io) also provide a good starting point.\n\n## Contributing\n\nWe need your help to make bifacial_radiance a great tool! Please see the [Contributing page](https://bifacial-radiance.readthedocs.io/en/stable/contributing.html) for more on how you can contribute. 
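bifacial_radiance computes rear-side irradiance by ray tracing with RADIANCE. As a much cruder, purely illustrative alternative (an isotropic view-factor estimate; the function names and the 0.7 bifaciality default are assumptions for this sketch, not taken from the package), the ground-reflected irradiance on a module's rear face can be approximated as:

```python
import math

def rear_ground_irradiance(ghi, albedo, tilt_deg):
    """Isotropic estimate of ground-reflected irradiance on the rear
    of a module tilted tilt_deg up from horizontal. The rear face
    sees the ground with view factor (1 + cos(tilt)) / 2: a flat
    module (tilt 0) faces the ground fully, a vertical module
    (tilt 90) sees half ground, half sky."""
    view_factor = (1.0 + math.cos(math.radians(tilt_deg))) / 2.0
    return ghi * albedo * view_factor

def bifacial_gain(front_poa, ghi, albedo, tilt_deg, bifaciality=0.7):
    """Fractional extra power from the rear side, given the front
    plane-of-array irradiance and a rear/front efficiency ratio
    (bifaciality)."""
    return bifaciality * rear_ground_irradiance(ghi, albedo, tilt_deg) / front_poa
```

This ignores shading, row-to-row effects, and sky diffuse on the rear, which is exactly what the ray-traced model captures and this sketch does not.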
The long-term success of bifacial_radiance requires substantial community support.\n\n## License\n\nBifacial_radiance open source code is copyrighted by the Alliance for Sustainable Energy and licensed with BSD-3-Clause terms, found [here](https://github.com/NREL/bifacial_radiance/blob/master/LICENSE).\n\n## Getting Support\n\nIf you suspect that you may have discovered a bug or if you'd like to\nchange something about bifacial_radiance, then please make an issue on our\n[GitHub issues page](https://github.com/NREL/bifacial_radiance/issues).\n\nbifacial_radiance questions can be asked on\n[Stack Overflow](http://stackoverflow.com) and tagged with\nthe [bifacial_radiance](http://stackoverflow.com/questions/tagged/bifacial_radiance) tag.\n\nThe [bifacial-radiance google group](https://groups.google.com/forum/#!forum/bifacial_radiance) \nhas just started, and will be used for discussing various topics of interest to the bifacial-radiance\ncommunity. We also make new version announcements on the google group.\n\n## Citing\n\nIf you use bifacial_radiance in a published work, please cite:\n\n Ayala Pelaez and Deline, (2020). bifacial_radiance: a python package for modeling bifacial solar photovoltaic systems. Journal of Open Source Software, 5(50), 1865, https://doi.org/10.21105/joss.01865\n\n\nPlease also cite the DOI corresponding to the specific version of bifacial_radiance that you used. bifacial_radiance DOIs are listed at [Zenodo.org](https://zenodo.org/search?page=1&size=20&q=conceptrecid:3860349&all_versions&sort=-version)\n\nAdditional bifacial_radiance publications with validation of the software include:\n* Deline, Chris, and Ayala, Silvana. Bifacial_Radiance. Computer Software. https://github.com/NREL/bifacial_radiance. USDOE Office of Energy Efficiency and Renewable Energy (EERE), Solar Energy Technologies Office (EE-4S). 17 Dec. 2017. Web. doi:10.11578/dc.20180530.16. 
https://www.osti.gov/doecode/biblio/6869\n* Ayala Pelaez S, Deline C, Greenberg P, Stein JS, Kostuk RK. Model and validation of single-axis tracking with bifacial PV. IEEE J Photovoltaics. 2019;9(3):715-721. https://ieeexplore.ieee.org/document/8644027 and https://www.nrel.gov/docs/fy19osti/72039.pdf (pre-print, conference version)\n* Ayala Pelaez, Deline C, MacAlpine M, Marion B, Stein J, Kostuk K. Comparison of Bifacial Solar Irradiance Model Predictions with Field Validation. IEEE J Photovoltaics. 2019; 9(1):82-87. https://ieeexplore.ieee.org/document/8534404\n\nOr check our [Github Wiki](https://github.com/NREL/bifacial_radiance/wiki) for a complete list of publications.\n""",",https://doi.org/10.5281/zenodo.3860350,https://doi.org/10.21105/joss.01865\n\n\nPlease,https://zenodo.org/search?page=1&size=20&q=conceptrecid:3860349&all_versions&sort=-version","2017/12/13, 17:56:18",2142,BSD-3-Clause,16,1533,"2023/09/26, 07:09:45",96,139,381,30,29,3,0.1,0.39451114922813035,"2023/03/11, 14:04:01",0.4.2,0,10,false,,false,false,,,https://github.com/NREL,http://www.nrel.gov,"Golden, CO",,,https://avatars.githubusercontent.com/u/1906800?v=4,,,
autoXRD,"A Python package for automatic XRD pattern classification of thin-films, tweaked for small and class-imbalanced datasets.",PV-Lab,https://github.com/PV-Lab/autoXRD.git,github,,Photovoltaics and Solar Energy,"2020/03/04, 18:03:26",46,0,7,false,Python,Accelerated Materials Laboratory for Sustainability,PV-Lab,Python,,"b'\nautoXRD\n===========\n## Description\n\n\nautoXRD is a Python package for automatic XRD pattern classification of thin-films, tweaked for small and class-imbalanced datasets. The main application of the package is high-throughput screening of novel materials.\n\nautoXRD performs physics-informed data augmentation to solve the small-data problem, implements a state-of-the-art a-CNN architecture, and allows interpretation using Average Class Activation Maps (CAMs), as described in the following publications:\n\n""**Oviedo, F., Ren, Z., Sun, S., Settens, C., Liu, Z., Hartono, N. T. P., ... & Buonassisi, T. (2019). Fast and interpretable classification of small X-ray diffraction datasets using data augmentation and deep neural networks. npj Computational Materials, 5(1), 60.**"" Link: [https://doi.org/10.1038/s41524-019-0196-x](https://doi.org/10.1038/s41524-019-0196-x)\n\n\n""**Fast classification of small X-ray diffraction datasets using data augmentation and deep neural networks (2019), Felipe Oviedo, Zekun Ren, et al.**"" Link: [arXiv:1811.08425v2](https://arxiv.org/abs/1811.08425v2)\n\nAccepted to the NeurIPS 2018 ML for Molecules and Materials Workshop; the final version was published in npj Computational Materials, 2019.\n\n\n## Installation\n\nTo install, just clone the following repository:\n\n`$ git clone https://github.com/PV-Lab/autoXRD.git`\n\n## Usage\n\nJust run `space_group_a_CNN.py` with the given datasets. Note that this classifies patterns into 7 space groups. 
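The physics-informed data augmentation mentioned above can be sketched in a few lines. This toy version (not the package's actual transforms; the parameter names and bounds are invented here) applies a small rigid 2-theta shift, mimicking lattice strain, and a global intensity rescale, mimicking texture or counting differences, to a 1-D pattern:

```python
import random

def augment_pattern(two_theta, intensity, rng, max_shift=0.25, max_scale=0.2):
    """Return one augmented copy of a 1-D XRD pattern: a rigid shift
    of the 2-theta axis plus a global intensity rescale. Peak spacings
    and relative intensities, which carry the space-group information,
    are preserved."""
    shift = rng.uniform(-max_shift, max_shift)
    scale = 1.0 + rng.uniform(-max_scale, max_scale)
    return [t + shift for t in two_theta], [scale * y for y in intensity]

def make_augmented_set(two_theta, intensity, n_copies, seed=0):
    """Grow a small dataset by generating n_copies random variants."""
    rng = random.Random(seed)
    return [augment_pattern(two_theta, intensity, rng) for _ in range(n_copies)]
```

Each variant is a physically plausible measurement of the same phase, which is what lets a small labelled set train a larger network without changing the class label.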
Dimensionality data is not included in the code; please contact the authors if interested.\nThe package contains the following modules and scripts:\n\n| Module | Description |\n| ------------- | ------------------------------ |\n| `space_group_a_CNN.py` | Script for XRD space-group classification with a-CNN |\n| `autoXRD` | Module dedicated to XRD pattern preprocessing and data augmentation |\n| `autoXRD_vis` | Visualizer module for class activation maps (CAMs) |\n| `Demo / XRD_dimensionality_demo.ipynb` | Notebook containing a demo for physics-informed data augmentation. This is a version with a modified CNN and no CAM to speed up the computation |\n\n\n## Authors\nFelipe Oviedo and ""Danny"" Zekun Ren\n\n\n|| |\n| ------------- | ------------------------------ |\n| **AUTHORS** | Felipe Oviedo and ""Danny"" Zekun Ren | \n| **VERSION** | 1.0 / May, 2019 | \n| **EMAIL OF REPO OWNER** | foviedo@mit.edu | \n|| |\n\n## Attribution\n\nThis work is under an Apache 2.0 License and the data policies of Nature Partner Journals Computational Materials. 
Please acknowledge use of this work with the appropriate citation.\n\n## Citation\n\n @article{oviedo2019fast, \n title={Fast and interpretable classification of small X-ray diffraction datasets using data augmentation and deep neural networks},\n author={Oviedo, Felipe and Ren, Zekun and Sun, Shijing and Settens, Charles and Liu, Zhe and Hartono, Noor Titan Putri and Ramasamy, Savitha and DeCost, Brian L and Tian, Siyu IP and Romano, Giuseppe and others},\n journal={npj Computational Materials},\n volume={5},\n number={1},\n pages={60},\n year={2019},\n publisher={Nature Publishing Group}}\n'",",https://doi.org/10.1038/s41524-019-0196-x,https://doi.org/10.1038/s41524-019-0196-x,https://arxiv.org/abs/1811.08425v2","2019/04/23, 19:15:53",1646,Apache-2.0,0,32,"2023/09/26, 07:09:45",1,0,0,0,29,0,0,0.09999999999999998,"2019/05/24, 16:47:16",1.0,0,2,false,,false,false,,,https://github.com/PV-Lab,pv.mit.edu,United States of America,,,https://avatars.githubusercontent.com/u/13911947?v=4,,,
BayesProcess,A Python package for Physics informed Bayesian network inference using neural network surrogate model for matching process / variable / performance in solar cells.,PV-Lab,https://github.com/PV-Lab/BayesProcess.git,github,,Photovoltaics and Solar Energy,"2021/08/18, 14:30:54",28,0,1,false,Jupyter Notebook,Accelerated Materials Laboratory for Sustainability,PV-Lab,"Jupyter Notebook,Python",,"b'## Description\n\nBayesProcess is a Python package for physics-informed Bayesian network inference, using a neural network surrogate model for matching process / variable / performance in solar cells.\n\n## Installation\n\nTo install, clone the repository and install its dependencies:\n\n git clone https://github.com/PV-Lab/BayesProcess.git\n cd BayesProcess\n pip install -r requirements.txt\n\n## Usage\n\nRun `JV_surrogate.py` with the given datasets to create the neural network surrogate for the numerical PDE solver.\nThen run `Bayes.py` with the saved surrogate model. This performs Bayesian network inference to map the process variable (temperature) to material descriptors. \nThe package contains the following modules and scripts:\n\n| Module | Description |\n| ------------- | ------------------------------ |\n| `JV_surrogate.py` | Script for training neural network JV surrogate model |\n| `Bayes.py` | Script for Bayesian inference using MCMC |\n| `requirements.txt` | required packages |\n\n\n\n## Authors\n""Danny"" Zekun Ren and Felipe Oviedo\n'",,"2019/07/09, 04:43:23",1569,MIT,0,17,"2023/09/26, 07:09:45",2,0,0,0,29,0,0,0.23076923076923073,,,0,2,false,,false,false,,,https://github.com/PV-Lab,pv.mit.edu,United States of America,,,https://avatars.githubusercontent.com/u/13911947?v=4,,,
solcore5,"A multi-scale, Python-based library for the modeling of solar cells and semiconductor materials.",qpv-research-group,https://github.com/qpv-research-group/solcore5.git,github,"photovoltaic,semiconductor,solar-cells,python,hacktoberfest",Photovoltaics and Solar Energy,"2023/09/01, 04:28:31",120,5,31,true,Python,Quantum Photovoltaics Research Group,qpv-research-group,"Python,Fortran,Meson,Shell",https://www.solcore.solar/,"b'[](https://mybinder.org/v2/gh/qpv-research-group/solcore5/develop?urlpath=lab)\n\n[](#contributors-)\n\n[](https://doi.org/10.5281/zenodo.1185316)\n[](http://www.gnu.org/licenses/lgpl.html)\n[](http://solcore5.readthedocs.io/en/latest/?badge=latest)\n\n[](https://codecov.io/gh/qpv-research-group/solcore5)\n[](https://app.codacy.com/gh/qpv-research-group/solcore5?utm_source=github.com&utm_medium=referral&utm_content=qpv-research-group/solcore5&utm_campaign=Badge_Grade_Settings)\n\n\nSolcore\n=======\n\n**Solcore** was born as a modular set of tools, written (almost) entirely in Python 3, to address some of the tasks we had to solve most often. With time, however, it has evolved into a complete semiconductor solver capable of modelling the optical and electrical properties of a wide range of solar cells, from quantum well devices to multi-junction solar cells.\n\nPlease visit [Solcore\\\'s Documentation](http://docs.solcore.solar) and the [Tutorial](docs/source/Examples/tutorial.rst) for a step-by-step example of how to use *Solcore* to model a solar cell, and also check the [Examples folder](examples) for more specific information and examples of usage.\n\n\n\n## Contributors \xe2\x9c\xa8\n\nThanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):\n\n\n\n\n\n\n\n\n\n\n\nThis project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification. 
Contributions of any kind welcome!'",",https://doi.org/10.5281/zenodo.1185316","2017/10/29, 20:13:23",2186,CUSTOM,161,988,"2023/06/07, 04:06:21",37,157,222,55,140,0,0.4,0.5503778337531486,"2023/02/19, 06:37:16",v5.9.1,7,23,false,,true,true,"PentW0lf/sunglass,AlexNDRmac/sunglass,qpv-research-group/sunglass,aidanobeirne/OptiFit,qpv-research-group/rayflare",,https://github.com/qpv-research-group,https://www.qpvgroup.org,"UNSW (Sydney, Australia) and Imperial College London (UK) until 2017",,,https://avatars.githubusercontent.com/u/48552948?v=4,,,
solax,Read energy usage data from the real-time API on Solax solar inverters.,squishykid,https://github.com/squishykid/solax.git,github,"solax,solar,photovoltaic,home-automation,iot,raspberry-pi",Photovoltaics and Solar Energy,"2023/07/16, 22:40:13",75,253,32,true,Python,,,"Python,Shell,HTML",,"b""# Solax\n\n[](https://github.com/squishykid/solax/actions)\n[](https://pypi.org/project/solax)\n\nRead energy usage data from the real-time API on Solax solar inverters.\n\n* Real time power, current and voltage\n* Grid power information\n* Battery level\n* Temperature and inverter health\n* Daily/Total energy summaries\n\n## Usage\n\n`pip install solax`\n\nThen from within your project:\n\n```\nimport solax\nimport asyncio\n\nasync def work():\n r = await solax.real_time_api('10.0.0.1')\n return await r.get_data()\n\nloop = asyncio.new_event_loop()\nasyncio.set_event_loop(loop)\ndata = loop.run_until_complete(work())\nprint(data)\n```\n\n## Confirmed Supported Inverters\n\nThese inverters have been tested and confirmed to be working. 
If your inverter is not listed below, this library may still work- please create an issue so we can add your inverter to the list \xf0\x9f\x98\x8a.\n\n* SK-TL5000E\n""",,"2019/04/09, 04:14:24",1660,MIT,8,60,"2023/10/07, 13:02:44",43,46,81,19,18,16,0.7,0.34375,"2022/09/11, 07:06:56",v0.3.0,0,14,false,,false,false,"klaernie/core,merca/core,Mic92/dream2nix-home-assistant,pillezu/homeassistantM2,Airler/home-assistant-core,Vip0r/varta_storage,k3a/ha_core,tomerh2001/core,tibber/core,alekslyse/core,mrazekv/solax-mqtt,MarcHagen/home-assistant-core,flake-it/flake-friend-evaluator,alphapum/core,StevenLooman/home-assistant,CyberFlameGO/core-4,navnit3366/core-dev,SemonCat/core,cisswolff/core,PolarFox/ha-core,JasonGao180/home-assistant,DeliciousHouse/core-canary,Hiosdra/core,VadimDor/VadimDor,vandenberghev/home-assistant,lima-limon-inc/OverlayStats,Redracer93/core,petegallagher/core,kundusurinder/HomeAuto,benbruneau/home-assistant-core,rongquantoh-intel/dependabot-demo,mouyuan123/home-assistant,RACELAND-Automacao-Lda/Homeland,RalphHightower/core,Atrejoe/home-assistant,Secure-Platforms-Lab-W-M/Helion-on-Home-Assistant,igewebs/home-assistant,downside-up-git/du-core-dev,sarvex/home-assistant-core,bjarnekrottje/ha-core,kaovilai/home-assistant-core,peterbacil/home-assistant-core,fustom/core,kevinterface/home-assistant,youdroid/core,Multipas84/core,MDevM20/core,jwalberg/ha-core,joushx/core,ChadMoran/hass-core,NiklasA95/home-assistant,Kipjr/home-assistant-core,jasonmadigan/core,samueltardieu/homeassistant-core,gabe565/core,SukramJ/core,kmplngj/home-assistant-core,jeyrb/home-assistant,MarioRamazzotti/Reisinger,aididhaiqal/core,naycha/home-assistant,octodemo/home-assistant-core,lorek123/home-assistant,charliejones1/core,elijahxb/core,Andre0512/home-assistant,marcelhoogantink/core,conallob/core,karman-de-lange/hass_broadlink_ac,majacQ/core-1,sipgate-io/home-assistant-core,oikarinen/core,aronpedersen/test,Nyaran/core,ryfont/core-1,BenoitAnastay/home-assistant-core,aladin2000/core,MarcJenni
ngsUK/home-assistant,jellespijker/core,rohanmuz2/Home-AI,brenank/home-assistant-core,aptalca/wheels,alidblad/ha-clone,lmendezr/home-assistant-core,iamwillbar/core,bakoorahnin/core,piotr-kubiak/home-assistant,shnolshnol1/core,N-hamdy/home-assistant,mww012/core,Stagie/core,joshuaspence/home-assistant-core,janekbaraniewski/home-assistant,jamescurtin/home-assistant,jfroy/core,stagietek/core,Linean1/Project1,Asuse420/core,u240/core,constructorfleet/home-assistant-loader,eltariel/home-assistant-core,classicvalues/core-2,jeroen84/home-assistant,Nyvek/home-assistant,terrorizer1980/core-2,NisaarAgharia/home.AI,tktf50/core,RubenKelevra/home-assistant_core,rohankumardubey/core,brett-fitz/core,fopina/hass-core,samsunga3888/hass,ekmixon/homeassistant-core,seantrue/core,ronaldburns/home-assistant,yxkj2022/python-core,qyl2021/python-core,cabraliebre/core,cnheider/core,Brianspha/home-assistant,btharper/core,kifeo/core,maruel/core,imchipwood/home-assistant,yoki31/core,savonman/core,sshyran/home-assistant-core,huizebruin/core,theyapps/core,mfugate1/core,sjbelisle/core,mikeyhodl/core,DivanX10/OpenWRT-and-Home-Assistant,linuxserver/wheelie,jisakiel/home-assistant,fleXible/home-assistant,jenniferliddle/core,dewgenenny/core,amnnet/core,informaticacba/home-assistant-core,Grendel7/prometheus-solax-exporter,coltoncat/core,Djelibeybi/home-assistant-core,Swissman1/home-assistant,fredrike/home-assistant,JonanOribe/home-assistant,matholiveira91/home-assistant,austinmroczek/home-assistant,stewart123579/home-assistant-core,SiwatINC/home-assistant-tensorflow,preetyrai11/IOT-HomeAssistance,nickovs/home-assistant,mcanaleta/home-assistant-core,vsevolodpohvalenko/home-assistant,PlumeSolution/core,Danielhiversen/home-assistant,ahayworth/home-assistant,kerstef/home-assistant-core,vasili8m/core,12DEP/hikg,gjohansson-ST/core,paulmonigatti/home-assistant-core,matkastner/core,Antall/core,lbschenkel/home-assistant,12DEP/gfgf,12DEP/hik,artchula/home-assistant,therealpedro/core,blastoise186/core,frankhildebran
dt/home-assistant,zeehio/home-assistant,djwmarcx/core,CrossEyeORG/homeassistant,miltos04/core,cyr-ius/home-assistant,DKAutomater/home-assistant,angelnu/core,Leviosa-Shades/core,Whoerr/core,fpetillo/home-assistant,qdotme/HomeAssistantRepository,thindiyeh/core,OpenPeerPower/core,joncar/home-assistant,piotrs112/home-assistant,bwghughes/octoclient,Vaarlion/core,Odianosen25/home-assistant,anngel78/core,ties/home-assistant,brentmaxwell/home-assistant,ConnectionMaster/home-assistant,lozoli/myio-integration,corbanmailloux/home-assistant-core,NigelRook/home-assistant,glance-/home-assistant,dannysauer/core,Sureshkumartv/core,TheLastGimbus/hass-core,alandtse/home-assistant,Pyhass/core,iracigt/home-assistant,mstovenour/home-assistant-core,jupe/home-assistant,ajk12345-code/home-assistant,derekxxx/core,mshiznitzh/core,DevSecNinja/core,ms32035/home-assistant,Watemlifts/home-assistant,escoand/home-assistant,keithkyle1989/core,mikeodr/core,sdwilsh/home-assistant,Verbalinsurection/Home-Assistant-core,diederikdevries/ha,Platinumwrist/core,Violet26/home-assistant,Omrisnyk/home-assistant,Vman45/home-assistant,PakapongDev/home-assistant,brianjcarroll8/Home-Assistant-Core,gzsjw/HomeAssistantStudy,peternijssen/home-assistant,ZachT1711/home-assistant,hacf-fr/home-assistant-core,tribut/home-assistant,timmo001/core,THATDONFC/home-assistant,DarkFox/home-assistant,sampou/homeassistant,artefolio/HomeAssistant,gpalsingh30/home-assistant-core,briangomez2016/core,84KaliPleXon3/home-assistant-core,gagebenne/home-assistant,mahadevanmani/home-assistant,switchpanel/ha-core,yenerhelvacioglu/homeasistant,Abigiris/home-assistant,bhchew/hass,MumiLila/gittest4,nordicenergy/home-assistant,gentoo-mirror/HomeAssistantRepository,onkelbeh/HomeAssistantRepository,Bulbutta/home-away,peterpanfy/homeassistant-sm,Watemlifts/Platform-Integration,Watemlifts/subsystems-,Watemlifts/Alfa,fabiandevia/home,home-assistant/core",,,,,,,,,,
bifacialvf,Bifacial PV View Factor model for system performance calculation.,NREL,https://github.com/NREL/bifacialvf.git,github,,Photovoltaics and Solar Energy,"2022/09/28, 21:40:30",25,1,6,false,Python,National Renewable Energy Laboratory,NREL,Python,https://bifacialvf.readthedocs.io,"b""\n\n# Bifacial PV View Factor model\n[](https://opensource.org/licenses/BSD-3-Clause)\n[](https://travis-ci.org/NREL/bifacialvf)\n[](https://zenodo.org/badge/latestdoi/114160149)\n\nkeywords: python, configuration factor model, electrical model mismatch for bifacial modules.\n\n## Introduction\n\nbifacialvf is a self-contained view factor (or configuration factor) model which\nreplicates a 5-row PV system of infinite extent perpendicular to the module\nrows. The function returns the irradiance profile along the middle (interior)\nrow by default, but user interface options include `'first'`, `'interior'`,\n`'last'`, and `'single'`. Single-axis tracking is supported, and hourly output\nfiles based on TMY inputs are saved. Spatial nonuniformity is reported, with\nmultiple rear-facing irradiances collected on the back of each module row.\n\nBilinear interpolation code add-on to bifacialvf (description below) to pre-generate IV arrays and bifacial coefficients, and to examine the energy production with back side irradiance mismatch for either a portrait or landscape module. \nIncluded are IV curves and bifacial info for a Yingli (standard) module. \n\n## Pre-requisites\nThis software is written for Python 2 or 3. NREL recommends [Anaconda Python](https://www.anaconda.com/download/).\n\n## Install using pip\n[bifacialvf](https://pypi.org/project/bifacialvf/) is at the Python Package Index (PyPI). Use pip to install the latest release in your conda environment or virtualenv:\n\n (myenv)$ pip install bifacialvf\n\n### Install development mode from GitHub\nFor those interested in contributing to bifacialvf:\n\n1. 
Clone the bifacialvf repository: `$ git clone https://github.com/NREL/bifacialvf.git bifacialvf-main`\n2. Navigate to the repository directory where `setup.py` is located: `$ cd bifacialvf-main`\n3. Install via pip in development mode: `$ pip install -e .`\n\n## Usage\n\nFor usage examples, see the Jupyter notebooks in the `docs` folder.\n\n## License\nbifacialvf open source code is copyrighted by the Alliance for Sustainable Energy and licensed with BSD-3-Clause terms, found here.\n\n## Citing bifacialVF\n\nIf you use bifacialvf in a published work, please cite:\n\n Marion, B., MacAlpine, S., Deline, C., Asgharzadeh, A., Toor, F., Riley, D., \xe2\x80\xa6 Hansen, C. (2017). A Practical Irradiance Model for Bifacial PV Modules: Preprint. In 44th IEEE Photovoltaic Specialists Conference. Washington, DC. https://www.nrel.gov/docs/fy17osti/67847.pdf. NREL/CP-5J00-67847\n\nPlease also cite the DOI corresponding to the specific version of bifacialvf that you used. bifacialvf DOIs are listed at [Zenodo.org](https://zenodo.org/search?page=1&size=20&q=conceptrecid:6369162&all_versions&sort=-version). \n\n Silvana Ovaitt, Chris Deline, Mark Mikofski, & Nick DiOrio. (2022). NREL/bifacialvf: v0.1.8 Release (v0.1.8). Zenodo. https://doi.org/10.5281/zenodo.6369162\n\nBilinear interpolation is based on the publications:\n\n De Soto, W., Klein, S. A., & Beckman, W. A. (2006). Improvement and validation of a model for photovoltaic array performance. Solar Energy, 80(1), 78\xe2\x80\x9388. https://doi.org/10.1016/j.solener.2005.06.010\n\n Marion, B., Rummel, S., & Anderberg, A. (2004). Current--voltage curve translation by bilinear interpolation. 
Progress in Photovoltaics: Research and Applications, 12(8), 593\xe2\x80\x93607.\n\n- bifacialvf: Original code by Bill Marion; Python translation by Silvana Ayala; updates by Chris Deline & team\n- Bilinear interpolation: Original code by Sara MacAlpine; Python translation & updates by Silvana Ayala\n- PVMismatch add-on: PVMismatch code from [PVMismatch](https://github.com/SUNPower/PVMismatch), by SunPower\n\n\n""",",https://zenodo.org/badge/latestdoi/114160149,https://zenodo.org/search?page=1&size=20&q=conceptrecid:6369162&all_versions&sort=-version,https://doi.org/10.5281/zenodo.6369162\n\nBilinear,https://doi.org/10.1016/j.solener.2005.06.010\n\n","2017/12/13, 19:23:14",2142,CUSTOM,0,198,"2022/03/18, 18:54:05",14,25,40,0,586,1,0.0,0.5555555555555556,"2022/09/28, 21:45:08",0.1.8.1,0,5,false,,false,false,narest-qa/repo40,,https://github.com/NREL,http://www.nrel.gov,"Golden, CO",,,https://avatars.githubusercontent.com/u/1906800?v=4,,,
solaR,Allows for reproducible research both for photovoltaics systems performance and solar radiation.,oscarperpinan,https://github.com/oscarperpinan/solar.git,github,,Photovoltaics and Solar Energy,"2021/10/18, 22:50:16",33,0,1,false,R,,,R,http://oscarperpinan.github.io/solar/,"b""solaR\n=====\n[](https://www.r-pkg.org/pkg/solaR)\n[](https://www.r-pkg.org/pkg/solaR)\n[](https://github.com/oscarperpinan/solar/actions)\n\n[](https://doi.org/10.18637/jss.v050.i09)\n\n\nThe `solaR` package allows for reproducible research both for\nphotovoltaics (PV) systems performance and solar radiation. It\nincludes a set of classes, methods and functions to calculate the sun\ngeometry and the solar radiation incident on a photovoltaic generator\nand to simulate the performance of several applications of the\nphotovoltaic energy. This package performs the whole calculation\nprocedure from both daily and intradaily global horizontal irradiation\nto the final productivity of grid-connected PV systems and water\npumping PV systems.\n\nIt is designed using a set of `S4` classes whose core is a group of\nslots with multivariate time series. The classes share a variety of\nmethods to access the information and several visualization\nmethods. In addition, the package provides a tool for the visual\nstatistical analysis of the performance of a large PV plant composed\nof several systems.\n\nAlthough `solaR` is primarily designed for time series associated to a\nlocation defined by its latitude/longitude values and the temperature\nand irradiation conditions, it can be easily combined with spatial\npackages for space-time analysis.\n\n# Software #\n\nThe stable version of solaR is hosted at\n[CRAN](https://cran.r-project.org/package=solaR). 
The development\nversion is available at\n[GitHub](https://github.com/oscarperpinan/solar/).\n\nInstall the stable version with:\n\n install.packages('solaR')\n\nYou can install the development version with the [`remotes`](https://github.com/r-lib/remotes) package:\n\n\tremotes::install_github('oscarperpinan/solar')\n\nor with [`devtools`](https://github.com/r-lib/devtools):\n\n devtools::install_github('oscarperpinan/solar')\n\n# Documentation #\n\nThe best place to learn how to use the package is the companion paper\npublished by the Journal of Statistical Software:\n\nPerpi\xc3\xb1\xc3\xa1n Lamigueiro, O. (2012). solaR: Solar Radiation and\nPhotovoltaic Systems with R. Journal of Statistical Software, 50(9),\n1\xe2\x80\x9332. https://doi.org/10.18637/jss.v050.i09\n\n[This book](https://oscarperpinan.github.io/esf/) (in\nSpanish) contains detailed information about solar radiation and\nphotovoltaic systems. In\n[my articles](https://oscarperpinan.github.io/) I frequently use\n`solaR`. \n\n# Citation #\n\nIf you use `solaR`, please cite it in any publication reporting\nresults obtained with this software:\n\nPerpi\xc3\xb1\xc3\xa1n Lamigueiro, O. (2012). solaR: Solar Radiation and\nPhotovoltaic Systems with R. Journal of Statistical Software, 50(9),\n1\xe2\x80\x9332. https://doi.org/10.18637/jss.v050.i09\n\nA BibTeX entry for LaTeX users is:\n\n @Article{,\n title = {{solaR}: Solar Radiation and Photovoltaic Systems with {R}},\n author = {Oscar Perpi{\\~n}{\\'a}n},\n journal = {Journal of Statistical Software},\n year = {2012},\n volume = {50},\n number = {9},\n pages = {1--32},\n\t\tdoi = {10.18637/jss.v050.i09}\n }\n\n""",",https://doi.org/10.18637/jss.v050.i09,https://doi.org/10.18637/jss.v050.i09\n\n,https://doi.org/10.18637/jss.v050.i09\n\nA","2013/07/28, 14:14:41",3741,GPL-3.0,0,220,"2020/05/24, 17:51:32",6,0,12,0,1249,0,0,0.0,"2014/04/27, 16:47:19",v0.38,0,1,false,,false,false,,,,,,,,,,,
SolarTherm,Solar thermal power/fuel station performance simulation and optimization using Modelica.,SolarTherm,https://github.com/SolarTherm/SolarTherm.git,github,"engineering,energy,solar,thermodynamics,optimisation,simulation,modelica,modelica-library",Photovoltaics and Solar Energy,"2023/09/11, 01:34:46",28,0,3,true,Modelica,,SolarTherm,"Modelica,Python,C,Motoko,Shell",,"b'Solar thermal power station performance simulation and optimisation.\n\n[](https://github.com/SolarTherm/SolarTherm/actions/workflows/main.yml)\n[](https://github.com/SolarTherm/SolarTherm/actions/workflows/msys2.yml)\n\n[Documentation](http://solartherm.readthedocs.org/en/latest/)\n\nSee also our wiki, which includes\n* A brief [SolarTherm tutorial](https://github.com/SolarTherm/SolarTherm/wiki/A-brief-tutorial-of-SolarTherm).\n* Instructions on [Building SolarTherm](https://github.com/SolarTherm/SolarTherm/wiki/Building-SolarTherm) on Linux.\n* Instructions for [Running SolarTherm on Windows (MSYS2)](https://github.com/SolarTherm/SolarTherm/wiki/Running-SolarTherm-on-Windows-%28MSYS2%29) (the recommended approach for Windows), or you can try [Running SolarTherm on Windows (using WSL)](https://github.com/SolarTherm/SolarTherm/wiki/Running-SolarTherm-on-Windows-%28using-WSL%29).\n* Instructions on how to [use SolarTherm from OMEdit](https://github.com/SolarTherm/SolarTherm/wiki/Running-SolarTherm-via-OMEdit).\n* A description of our [continuous integration and automated testing](https://github.com/SolarTherm/SolarTherm/wiki/Automated-testing-of-SolarTherm-code) setup.\n* Instructions to [link CoolProp to SolarTherm](https://github.com/SolarTherm/SolarTherm/wiki/Integration-with-CoolProp).\n'",,"2015/07/21, 05:23:10",3018,LGPL-3.0,120,1403,"2023/09/11, 01:34:47",48,18,62,8,44,4,0.3,0.6070615034168565,,,0,8,false,,false,false,,,https://github.com/SolarTherm,,,,,https://avatars.githubusercontent.com/u/15701216?v=4,,,
LibreSolar,Firmware for LibreSolar BMS boards based on bq769x0 or ISL94202.,LibreSolar,https://github.com/LibreSolar/bms-firmware.git,github,,Photovoltaics and Solar Energy,"2023/09/11, 14:18:19",97,0,35,true,C,Libre Solar Project,LibreSolar,"C,C++,CMake,Shell",https://libre.solar/bms-firmware/,"b'# Libre Solar BMS Firmware\n\n\n\nThis repository contains the firmware for Libre Solar Battery Management Systems based on [Zephyr RTOS](https://www.zephyrproject.org/) .\n\n## Development and release model\n\nThe `main` branch is used for ongoing development of the firmware.\n\nReleases are created from `main` after significant updates have been introduced to the firmware. Each release has to pass tests with multiple boards.\n\nA release is tagged with a version number consisting of the release year and a release count for that year (starting at zero). For back-porting of bug-fixes, a branch named after the release followed by `-branch` is created, e.g. `v21.0-branch`.\n\n## Documentation\n\nThe firmware documentation including build instructions and API reference can be found under [libre.solar/bms-firmware](https://libre.solar/bms-firmware/).\n\nIn order to build the documentation locally you need to install Doxygen, Sphinx and Breathe and run `make html` in the `docs` folder.\n\n## License\n\nThis firmware is released under the [Apache-2.0 License](LICENSE).\n'",,"2016/11/30, 11:19:50",2520,Apache-2.0,37,288,"2023/09/24, 15:39:43",6,7,25,5,31,0,1.2857142857142858,0.017421602787456414,"2023/09/11, 15:01:30",v23.1,0,5,false,,false,false,,,https://github.com/LibreSolar,https://libre.solar,"Hamburg, Germany",,,https://avatars.githubusercontent.com/u/17674115?v=4,,,
Charge Controller Firmware,Firmware for LibreSolar MPPT/PWM charge controllers.,LibreSolar,https://github.com/LibreSolar/charge-controller-firmware.git,github,,Photovoltaics and Solar Energy,"2023/08/16, 11:34:30",120,0,13,true,C++,Libre Solar Project,LibreSolar,"C++,C,CMake,Python,Shell,Batchfile",https://libre.solar/charge-controller-firmware/,"b'# Libre Solar Charge Controller Firmware\n\n\n\nThis repository contains the firmware for the different Libre Solar Charge Controllers based on [Zephyr RTOS](https://www.zephyrproject.org/).\n\nCoding style is described [here](https://github.com/LibreSolar/coding-style).\n\n## Development and release model\n\nThe `main` branch is used for ongoing development of the firmware.\n\nReleases are created from `main` after significant updates have been introduced to the firmware. Each release has to pass tests with multiple boards.\n\nA release is tagged with a version number consisting of the release year and a release count for that year (starting at zero). For back-porting of bug-fixes, a branch named after the release followed by `-branch` is created, e.g. `v21.0-branch`.\n\n## Documentation\n\nThe firmware documentation including build instructions and API reference can be found under [libre.solar/charge-controller-firmware](https://libre.solar/charge-controller-firmware/).\n\nIn order to build the documentation locally you need to install Doxygen, Sphinx and Breathe and run `make html` in the `docs` folder.\n\n## License\n\nThis firmware is released under the [Apache-2.0 License](LICENSE).\n'",,"2016/08/02, 12:03:35",2640,Apache-2.0,9,510,"2023/08/16, 11:34:31",10,80,121,1,70,1,1.3,0.10882956878850103,"2021/04/14, 11:02:11",v21.0,0,11,false,,true,true,,,https://github.com/LibreSolar,https://libre.solar,"Hamburg, Germany",,,https://avatars.githubusercontent.com/u/17674115?v=4,,,
pvoutput,Python code for downloading PV data from PVOutput.org.,openclimatefix,https://github.com/openclimatefix/pvoutput.git,github,"pvoutput,python,python-library,solar,nowcasting",Photovoltaics and Solar Energy,"2023/07/05, 10:11:51",30,2,6,true,Python,Open Climate Fix,openclimatefix,Python,,"b'\n[](#contributors-)\n\n\n[](https://codecov.io/gh/openclimatefix/pvoutput)\n\nDownload historical solar photovoltaic data from [PVOutput.org](https://pvoutput.org).\n\nThis code is a work-in-progress. The aim is to provide both a Python library for interacting with [PVOutput.org\'s API](https://pvoutput.org/help.html#api), and a set of scripts for downloading lots of data :)\n\n# Installation\n\n```bash\n$ pip install pvoutput-ocf\n```\n\n## Register with PVOutput.org\n\nYou need to get an API key *and* a system ID from PVOutput.org.\n\nIf you don\'t have a PV system, click the ""energy consumption only"" box\nwhen registering on PVOutput. If you don\'t include a\nsystem ID, then you\'ll get a ""401 Unauthorized"" response from the PVOutput API.\n\nYou can pass the API key and system ID into the `PVOutput` constructor.\nOr, create a `~/.pvoutput.yml` file which looks like:\n\n```yaml\napi_key: \nsystem_id: \n```\n\nThe default location of the `.pvoutput.yml` is the user\'s home directory, expanded from `~`. This can be overridden by setting the `PVOUTPUT_CONFIG` environment variable.\n\ne.g. `export PVOUTPUT_CONFIG=""/my/preferred/location/.pvoutput.yml""`\n\nAlternatively, you can set `API_KEY`, `SYSTEM_ID` and `DATA_SERVICE_URL` (see below) as environmental variables.\n\n### API quotas and paid subscriptions\nPlease see [here](https://pvoutput.org/help/data_services.html) for update info.\n\n#### Free\n\nPVOutput.org gives you 60 API requests per hour. Per request, you can download one day of data for one PV system. 
(See PVOutput\'s docs for more info about [rate limits](https://pvoutput.org/help/api_specification.html#rate-limits).)\n\n#### Donate\n[Donating to PVOutput.org](https://pvoutput.org/help/donations.html#donations) increases your quota for a year to 300 requests per hour.\n\n#### Paid\nTo get more historical data, you can pay $600 Australian dollars for a year\'s \'Live System History\' subscription for a single country ([more info here](https://pvoutput.org/help/data_services.html). And [here\'s PVOutput.org\'s full price list](https://pvoutput.org/services.jsp)).\nThis allows you to use the [`get batch status`](https://pvoutput.org/help/data_services.html#get-batch-status-service) API to download 900 PV-system-*years* per hour.\n\nIf you have subscribed to PVOutput\'s data service then either\n- add `data_service_url` to your configuration file (`~/.pvoutput.yml`) or\n- pass `data_service_url` to the `PVOutput` constructor.\n\nThe `data_service_url` should end in `.org` (note this does not include the `/service/r2` part of the URL).\nFor example: `data_service_url: https://pvoutput.org/`\n\n\n## Install pvoutput Python library\n\n`pip install -e git+https://github.com/openclimatefix/pvoutput.git@main#egg=pvoutput-ocf`\n\n# Usage\n\nSee the [Quick Start notebook](examples/quick_start.ipynb).\n\n## Contributors \xe2\x9c\xa8\n\nThanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):\n\n\n\n\n\n\n\n\n\n\n\nThis project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification. Contributions of any kind welcome!\n'",,"2019/07/16, 16:51:03",1562,Apache-2.0,20,350,"2022/12/07, 10:48:22",31,51,90,5,322,3,0.6,0.6733067729083666,"2023/07/05, 10:12:58",v0.1.30,0,12,false,,false,false,"AxReds/solaredge,openclimatefix/PVConsumer",,https://github.com/openclimatefix,openclimatefix.org,London,,,https://avatars.githubusercontent.com/u/48357542?v=4,,,
predict_pv_yield,Use machine learning to map satellite imagery of clouds to solar PV yield.,openclimatefix,https://github.com/openclimatefix/predict_pv_yield.git,github,nowcasting,Photovoltaics and Solar Energy,"2022/04/07, 10:59:27",53,0,10,false,Jupyter Notebook,Open Climate Fix,openclimatefix,"Jupyter Notebook,Python",,"b""# Intro\nEarly experiments on predicting solar electricity generation over the next few hours, using deep learning, satellite imagery, and as many other data sources as we can think of :)\n\nThese experiments are focused on predicting solar PV yield.\n\nPlease see [SatFlow](https://github.com/openclimatefix/satflow/) for complementary experiments on predicting the next few hours of satellite imagery (i.e. trying to predict how clouds are going to move!)\n\nAnd please see [OCF's Nowcasting page](https://github.com/openclimatefix/nowcasting) for more context.\n\n# Installation\n\nFrom within the cloned `predict_pv_yield` directory:\n\n```\nconda env create -f environment.yml\nconda activate predict_pv_yield\npip install -e .\n```\n""",,"2020/09/30, 16:46:56",1120,MIT,0,379,"2023/02/08, 12:35:24",46,20,53,1,259,2,0.1,0.37168141592920356,,,1,3,false,,false,false,,,https://github.com/openclimatefix,openclimatefix.org,London,,,https://avatars.githubusercontent.com/u/48357542?v=4,,,
solar-panel-detection,"Using a combination of AI (machine vision), open data and short-term forecasting, the project aims to determine the amount of solar electricity being put into the UK grid at a given time (i.e., ""right now"", or ""nowcasting"")",alan-turing-institute,https://github.com/alan-turing-institute/solar-panel-detection.git,github,"hut23,hut23-425",Photovoltaics and Solar Energy,"2020/04/22, 08:13:06",17,0,0,false,Jupyter Notebook,The Alan Turing Institute,alan-turing-institute,"Jupyter Notebook,TSQL,Python,Makefile",,"b'# Solar Panel Detection (Turing Climate Action Call)\n\nProject code: R-SPES-115 - Enabling worldwide solar PV nowcasting via machine vision and open data\n\nHut23 issue: https://github.com/alan-turing-institute/Hut23/issues/425\n\n- [Sheffield Solar](https://www.solar.sheffield.ac.uk/)\n- [Open Climate Fix](https://openclimatefix.org/)\n- [Open Street Map](https://www.openstreetmap.org)\n- [Open Infrastructure Map](https://openinframap.org)\n\n## Main Project Description\n\nUsing a combination of AI (machine vision), open data and short term forecasting, the project aims to determine the amount of solar electricity being put into the UK grid at a given time (i.e., \xe2\x80\x9cright now\xe2\x80\x9d, or \xe2\x80\x9cnowcasting\xe2\x80\x9d).\n\nDan Stowell (Queen Mary) and collaborators are working on using a number of datasets, each of which are incomplete and messy, to create an estimate of all solar panels and their orientation in the UK. This will involve some data wrangling to combine a number of geospatial data sources and then use data science methods to determine the solar panel assets across the UK and provide a web service to disseminate the results.\n\nData sources will be from Open Street Maps, which has been tagging solar panels in the UK, as well as other data provided by Sheffield Solar and Open Climate Fix. 
The REG would be doing most of the data wrangling and machine learning on the project, with the other partners providing data and expertise.\n\n## REG Project\n\n### Goals\n\n1. Aggregate UK solar PV data into a structured format, which can be accessed.\n2. Link the tagged panels in OSM to the other data sources\n\n## Overview of the directory structure\n\n```\n.\n|-- admin -- project process and planning docs\n|-- data\n| |-- as_received -- downloaded data files\n| |-- raw -- manually edited files (replace dummy data)\n| |-- processed\n|-- db -- database creation\n|-- doc -- documentation\n|-- explorations -- exploratory work\n`-- notebooks\n```\n\n\n## Data\n\nData is held in three directories: `as_received` contains the data precisely as\ndownloaded from its original source and in its original format; `raw` contains\ndata that has been manually restructured or reformatted to be suitable for use by\nsoftware in the project (see ""Using this repo"" header below). `processed` contains data that may have been processed in some way, such as by Python code, but is still thought of as \xe2\x80\x9csource\xe2\x80\x9d data.\n\nThe following sources of data are used:\n\n- OpenStreetMap - [Great Britain download (Geofabrik)](https://download.geofabrik.de/europe/great-britain.html).\n - [OSM data types](https://wiki.openstreetmap.org/wiki/Elements)\n - [Solar PV tagging](https://wiki.openstreetmap.org/wiki/Tag:generator:source%3Dsolar)\n- [FiT](https://www.ofgem.gov.uk/environmental-programmes/fit/contacts-guidance-and-resources/public-reports-and-data-fit/installation-reports) - Report of installed PV (and other tech including wind). 100,000s entries.\n- [REPD](https://www.gov.uk/government/publications/renewable-energy-planning-database-monthly-extract) - Official UK data from the ""renewable energy planning database"". 
It contains large solar farms only.\n- Machine Vision dataset - supplied by Descartes labs (Oxford), not publicly available yet.\n\n## Project outcome\n\nThis repo includes a set of scripts that will take\ninput datasets (REPD, OSM, FiT and machine vision \xe2\x80\x93 each in diff format),\nperform data cleaning/conversion, populate a PostgreSQL database, perform\ngrouping of data where necessary (there are duplicate entries in REPD, multiple solar farm\ncomponents in OSM) and then match entries between the data tables, based on the\nmatching criteria we have come up with.\n\nThe database creation and matching scripts should work with newer versions of the source data files, or at least do so with minimal changes to the data processing (see ""Using this repo"" below).\n\nThe result of matching is a table in the database called `matches` that links the unique identifiers of the\ndata tables. This also contains a column called `match_rule`, which refers to the method by which the match was determined, as documented in [doc/matching](doc/matching.md).\n\n## Using this repo\n\n### Install requirements\n\n1. Install [PostgreSQL](https://www.postgresql.org/download/)\n2. Install Python 3 (version 3.7 or later) and `pip`\n3. Run `pip install -r requirements.txt`\n4. Install [Osmium](https://osmcode.org/osmium-tool/)\n\n### Download and prepare data files\n\n1. Download the following data files from the internet and store locally. 
We recommend saving these original data files within the directory structure under `data/as_received`:\n - OSM PBF file (GB extract): [Download](https://download.geofabrik.de/europe/great-britain-latest.osm.pbf)\n - FiT reports: Navigate to [ofgem](https://www.ofgem.gov.uk/environmental-programmes/fit/contacts-guidance-and-resources/public-reports-and-data-fit/installation-reports) and click the link for the latest Installation Report (during the Turing project, 30 September 2019 was used), then download the main document AND subsidiary documents\n - REPD CSV file: [Download](https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/879414/renewable-energy-planning-database-march-2020.csv) - this is always the most up to date version\n - Machine Vision dataset: supplied by Descartes labs (Oxford), not publicly available yet.\n2. Navigate to `submodules/compile_osm_solar` and edit the `osmsourcefpath` in `compile_osm_solar.py` so that the file path points to the OSM PBF file you downloaded. After installing the requirements in the submodule README, run `python compile_osm_solar.py`. One of the data files produced is a csv, which we use as source data. You can move this file to `data/as_received`\n3. Carry out manual edits to the data files, as described in [doc/preprocessing](doc/preprocessing.md) and save them in `data/raw` under the names suggested by the doc, replacing the default dummy data files.\n4. Navigate to `data/processed` and type `make` - this will create versions of the data files ready for import to PostgreSQL\n\n### Run the database creation and data matching\n\n4. Make sure you have PostgreSQL on your machine, then run the command: `createdb hut23-425 ""Solar PV database matching""` - this creates the empty database.\n5. 
Navigate to `db` and run the command `psql -f make-database.sql hut23-425` - this populates the database (see [doc/database](doc/database.md)), carries out some de-duplication of the datasets and performs the matching procedure (see [doc/matching](doc/matching.md)). Note: this may take several minutes.\n\nNote that the above commands require you to have admin rights on your PostgreSQL server. On standard Debian-based machines you could prepend the commands with `sudo -u postgres`, or you could assign privileges to your own user account.\n\n## External collaborators guidance\n\nFrom April 2020 this repo is no longer under active development, however a fork of the project is being created by [Open Climate Fix](https://github.com/openclimatefix) if you wish to open issues and pull requests there.\n'",,"2019/11/13, 11:23:12",1442,MIT,0,341,"2020/04/22, 08:13:07",7,30,55,0,1281,1,0.4,0.33333333333333337,,,0,3,false,,false,false,,,https://github.com/alan-turing-institute,https://turing.ac.uk,,,,https://avatars.githubusercontent.com/u/18304793?v=4,,,
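The matching pipeline described in the solar-panel-detection README (populate tables, de-duplicate, then write a `matches` table whose `match_rule` column records how each match was made) can be sketched in miniature. This is an illustrative sqlite3 toy, not the project's PostgreSQL scripts; every table and column name here is hypothetical except `matches` and `match_rule`, which the README itself describes:

```python
import sqlite3

# Toy schema mirroring the described design: source tables keep their own
# unique identifiers, and "matches" links them, tagging each link with the
# rule that produced it (the real rules live in doc/matching.md).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE repd (repd_id INTEGER PRIMARY KEY, site_name TEXT)")
cur.execute("CREATE TABLE osm (osm_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE matches (repd_id INTEGER, osm_id INTEGER, match_rule TEXT)")
cur.execute("INSERT INTO repd VALUES (1, 'Big Solar Farm')")
cur.execute("INSERT INTO osm VALUES (10, 'Big Solar Farm')")
# A simple name-equality rule, standing in for the project's real criteria.
cur.execute("""INSERT INTO matches
    SELECT r.repd_id, o.osm_id, 'name_exact'
    FROM repd r JOIN osm o ON r.site_name = o.name""")
rows = cur.execute("SELECT * FROM matches").fetchall()
print(rows)  # [(1, 10, 'name_exact')]
```

The same join-then-tag shape extends to fuzzier rules (distance thresholds, capacity similarity) by adding one INSERT per rule with a distinct `match_rule` label.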
solarpy,"This package aims to provide a reliable solar radiation model, mainly based on the work of Duffie, J.A., and Beckman, W. A., 1974, ""Solar energy thermal processes"".",aqreed,https://github.com/aqreed/solarpy.git,github,"solar-energy,solar-cells,sun-position,beam-irradiance,photovoltaic,python,modeling,simulation,flight-simulation",Photovoltaics and Solar Energy,"2019/09/22, 20:12:22",41,10,13,false,Python,,,Python,,"b'
\n\n[](https://travis-ci.com/aqreed/solarpy)\n[](https://codecov.io/gh/aqreed/solarpy/branch/master)\n[](https://github.com/aqreed/solarpy/raw/master/COPYING)\n[](https://mybinder.org/v2/gh/aqreed/solarpy/master?filepath=examples)\n\n| | |\n| ------ | ------ |\n| Description | Python Solar Radiation model |\n| Author | aqreed |\n| Version | 0.1.3 |\n| Python Version | 3.6 |\n| Requires | Numpy, Matplotlib |\n\nThis package aims to provide a reliable solar radiation model, mainly based on the work of Duffie, J.A., and Beckman, W. A., 1974, ""Solar energy thermal processes"".\n\nThe main purpose is to generate a **solar beam irradiance** (W/m2) prediction on:\n* **any plane**, thanks to the calculation of the solar vector in NED (North East Down) coordinates, suitable for its use in flight dynamics simulations...\n* **any place of the earth**, taking into account the solar time wrt the standard time, geometric altitude, the latitude influence on solar azimuth and solar altitude as well as sunset/sunrise time and hour angle, etc.\n* **any day of the year**, taking into account the variations of the extraterrestrial radiation, the equation of time, the declination, etc., throughout the year\n\n#### Example 1\nSolar [irradiance](https://en.wikipedia.org/wiki/Solar_irradiance) on the southern hemisphere on October 17, at sea-level 13.01UTC (plane pointing upwards)?\n\n```\nimport numpy as np\nfrom solarpy import irradiance_on_plane\nfrom datetime import datetime\n\nvnorm = np.array([0, 0, -1]) # plane pointing zenith\nh = 0 # sea-level\ndate = datetime(2019, 10, 17, 13, 1) # year, month, day, hour, minute\nlat = -23.5 # southern hemisphere\n\nirradiance_on_plane(vnorm, h, date, lat)\n```\n\nA dedicated Jupyter Notebook on beam irradiance can be found [here](https://github.com/aqreed/solarpy/blob/master/examples/solar_irradiance.ipynb).\n\n#### Example 2\nPower output (in W) of a solar panel with the following characteristics:\n* surface of 2.1 sqm\n* efficiency of 
0.2\n* pointing upwards\n* in NYC\n* on December 25, at 16.15\n\n```\nfrom numpy import array\nfrom solarpy import solar_panel\nfrom datetime import datetime\n\npanel = solar_panel(2.1, 0.2, id_name=\'NYC_xmas\') # surface, efficiency and name\npanel.set_orientation(array([0, 0, -1])) # upwards\npanel.set_position(40.73, -73.93, 0) # NYC latitude, longitude, altitude\npanel.set_datetime(datetime(2019, 12, 25, 16, 15)) # Christmas Day!\npanel.power()\n```\n\n#### Example 3\nSolar [declination](https://en.wikipedia.org/wiki/Position_of_the_Sun#Declination_of_the_Sun_as_seen_from_Earth) on August 5?\n\n```\nfrom solarpy import declination\nfrom datetime import datetime\n\ndate = datetime(2019, 8, 5) # August 5\n\ndeclination(date)\n```\n\nPlease find more notebooks on the [\'examples\'](https://github.com/aqreed/solarpy/tree/master/examples) folder that you can open locally, or just try [](https://mybinder.org/v2/gh/aqreed/solarpy/master?filepath=examples) to launch online interactive Jupyter notebooks.\n\n---\n**NOTE**:\nsolarpy is under development and might change in the near future.\n\n---\n\n### Dependencies\n\nThis package depends on Python, NumPy and Matplotlib and is usually tested on Linux with the following versions:\n\nPython 3.6, NumPy 1.16, Matplotlib 3.0\n\n### Installation\n\nsolarpy has been written in Python3, and its version v0.1 is available in PyPi. It can be installed using:\n\n```\n$ pip install solarpy\n```\n\nTo install in development mode:\n\n```sh\n$ git clone https://github.com/aqreed/solarpy.git\n$ cd solarpy\n$ pip install -e .\n```\n\n### Testing\n\nsolarpy recommends py.test for running the test suite. 
Running from the top directory:\n\n```sh\n$ pytest\n```\n\nTo test coverage (also from the top directory):\n\n```sh\n$ pytest --cov\n```\n\n### Bug reporting\n\nPlease feel free to open an [issue](https://github.com/aqreed/solarpy/issues) on GitHub!\n\n### License\n\nMIT (see `COPYING`)\n'",,"2019/07/28, 15:53:41",1550,MIT,0,257,"2019/08/13, 12:22:22",7,0,1,0,1534,1,0,0.0,,,0,1,false,,false,false,"TalweSingh/DAISE,tbongiov/mettoolbox_tb,kcpgilbert/Weather-Expert-Solar-Forecasting,robinjo78/solarBAG,timcera/mettoolbox,kmiecikmichal/PV_ENERGY_PRODUCTION_FORECASTING,DCC-Lab/projetNeige,samurai-madrid/data-processing,samurai-madrid/reinforcement-learning,ebasanez/samur.ai",,,,,,,,,,
solariot,Leverage your IoT enabled Solar PV Inverter to stream your solar energy usage data to a real time dashboard.,meltaxa,https://github.com/meltaxa/solariot.git,github,"freeboard,iot,solar-energy,sungrow-inverter,dashboard,pvoutput,modbus-sungrow,dweet,influxdb,telemetry",Photovoltaics and Solar Energy,"2023/03/15, 09:33:33",175,0,39,true,Python,,,"Python,JavaScript,Dockerfile",https://solariot.live,"b'# Solariot\n\nLeverage your IoT enabled Solar PV Inverter to stream your solar energy usage\ndata to a real time dashboard.\n\nSolariot will connect directly to your Inverter using Modbus TCP. \n\nCurrently, Solariot is able to talk to a SMA Sunny Boy and Sungrow SH5K & SG5KD inverters. \nSolariot is designed to allow any Modbus TCP enabled inverter to be queried using a Modbus register map.\n\nData is collected and can be streamed to destinations like dweet.io, MQTT, InfluxDB or PVOutput. \nTo visualise the telemetry, use a dashboard such as Grafana. For example, this is Meltaxa\'s Grafana dashboard on \nsolariot.live:\n\n \n
\n
\n\n## Pre-requisites\n\n* The Inverter must be accessible on the network using TCP.\n\n* This Python script should work on most Inverters that talk Modbus TCP. You can \ncustomise your own modbus register file.\n\n* Run on Python 3.5+.\n\n## Installation\n\n1. Download or clone this repository to your local workstation.\n ```\n git clone https://github.com/meltaxa/solariot.git\n cd solariot\n ```\n \n2. Install the required libraries.\n ```\n pip install --upgrade -r requirements.txt\n ```\n \n3. Update the config.py with your values, such as the Inverter\'s IP address, \nport, inverter model (which corresponds to the modbus register file) and the\nregister addresses Solariot should scan from. Enable optional support for MQTT,\nPVOutput, InfluxDB and more.\n\n4. Run the solariot.py script. \n ```\n ./solariot.py\n ```\n * Command line options:\n ```\n -c Python module to load as our config. Default is config.py.\n -v Level of verbosity 0=ERROR 1=INFO 2=DEBUG.\n --one-shot Run Solariot just once then exit.\n ```\n## Docker\n\n1. Create a directory for the config file [config.py].\n\n2. Create a config.py (see config-example.py) and place it in the config directory.\n\n3. Run the Docker image with the volume switch to mount your config directory as /config in the image\n * `docker run -v :/config meltaxa/solariot`\n\nNote that the container runs as UID/GID 2000, so mounted config files will need to be readable, e.g.\n\n```bash\nchgrp 2000 $FILE # Set group of file to 2000\nchmod g+r $FILE # Allow group 2000 to read file\n```\n\n## Next Steps\n\nNow that you are collecting the inverter\'s data, you\'ll want to ultimately\ndisplay it in a dashboard as seen above. \n\nThere are many methods to stream the data. Here are a few options, which\ncan be enabled in Solariot. \n\n### Dweet.io and Freeboard\n\nThis is the quickest method and is a good place to start.\n\nMetrics are streamed to dweet.io, a free IoT messaging service. No sign up is \nrequired. 
All you need to do is create a unique identifier by updating the\ndweepy_uuid value in the config.py file.\n\nData can then be visualised using a ~~free~~ low-cost dashboard service from \n[Freeboard](https://freeboard.io/). You\'ll need to create your own dashboard,\nusing dweet.io as your data source.\n\n### MQTT Support\n\nThis is a good way to push data to MQTT topics to which you can subscribe various tools, \nsuch as Node-RED or Home Assistant. Running your own MQTT server will mean you can\nalso retrieve these values when your internet is offline.\n\nAll you need to do is to set the `mqtt_server`, `mqtt_port`, `mqtt_username`, \n`mqtt_password` and `mqtt_topic` values in the `config.py` file and you\'ll be up \nand running.\n\n### InfluxDB and Grafana\n\nUse a time series database such as \n[InfluxDB](https://github.com/influxdata/influxdb) to store the inverter data as\nit streams in. You\'ll need to install this on your own server.\n\nTo display the data in a real-time dashboard, you can use \n[Grafana](https://grafana.com/get) to pull the metrics from InfluxDB. You can \neither install your own Grafana server or use their free \n[Grafana hosted solution](https://grafana.com/cloud/grafana).\n\nA JSON export of the solariot.live Grafana dashboard is available under the grafana folder.\nThe file will require editing to match your InfluxDB settings.\n\n### Prometheus and Grafana\n\n[Prometheus](https://prometheus.io/) can be enabled in config.py by setting `prometheus` to true. 
The data will then be exported on the port specified by `prometheus_port` (defaults to 8000).\n\nYou can configure [Prometheus](https://prometheus.io/) to scrape this by adding a rule like this to your prometheus.yml:\n```\nscrape_configs:\n - job_name: \'solariot\'\n scrape_interval: 30s\n static_configs:\n - targets: [\'localhost:8000\']\n```\n\nAlternatively, if you\'re using [Kubernetes](https://kubernetes.io/) you can use this [helm chart](https://github.com/slackerlinux85/HelmCharts/tree/master/helm-chart-sources/solariot)\n\n### PVOutput.org\n\nWe offer direct integration for publishing metrics to the \'Add Status\' [API endpoint](https://pvoutput.org/help.html#api-addstatus) of PVOutput.\n\nSupported values are `v1` through `v6`, with the assumption that the `v1` and `v3` values are incremental and reset every day.\n\nAll you need to do is set the `pvoutput_api`, `pvoutput_sid` and `pvoutput_rate_limit` values in the `config.py` file and \nyou\'ll be publishing in no time!\n\n## Integration with PVOutput.org and Grafana\n\nIf you are using Grafana as your dashboard, a neat little trick is to then\nincorporate your Grafana panels with your PVOutput as system photos. From your\n[PV Ladder page](https://pvoutput.org/ladder.jsp?f=1&pf=4102&pt=4102&sf=5130&st=5130&country=1&in=Sungrow&pn=Infinity&io=1&oc=0), click on your photos to view the real time Grafana images: \n\n\n\n1. Obtain your Grafana panel direct link, see their documentation: .\n\n2. In your PVOutput ""Edit System"" page, add your Grafana panel link in the \n""Image Link"" field. Append ""&png"" to the link. Note, if the URL is longer than \n100 characters, use a URL shortener service instead (such as ).\nDon\'t forget to append the ""&png"" string to your URL.\n\n3. 
Now go to your system in the PV Ladder page and click on the photos.\n\n:bulb: Tip: You can add any URL image, such as the latest weather radar image \n:wink:\n\n## Contributions\n\nIf you have created a modbus register map for an inverter, please submit your\nfile as a pull request for inclusion in Solariot.\n\n## Acknowledgements\n\n* [michael-robbins](https://github.com/michael-robbins) for Docker support, modbus contrib and other improvements.\n* [rpvelloso](https://github.com/rpvelloso) for the SungrowModbusTcpClient class that enables decryption of comms.\n* [shannonpasto](https://github.com/shannonpasto) for the Sungrow SG3KD modbus map.\n* [ShogunQld](https://github.com/ShogunQld) for the SMA Sunny Boy modbus map.\n* [zyrorl](https://github.com/zyrorl) for MQTT support contrib.\n'",,"2017/09/15, 12:26:12",2231,MIT,6,158,"2023/03/21, 15:57:53",13,36,69,14,218,1,0.1,0.5106382978723405,,,0,11,true,github,false,false,,,,,,,,,,,
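The MQTT destination that the Solariot README describes can be sketched as a minimal `config.py` fragment. The key names (`mqtt_server`, `mqtt_port`, `mqtt_username`, `mqtt_password`, `mqtt_topic`) are the ones the README lists; every value below is a placeholder assumption you would replace with your own broker details:

```python
# Hypothetical config.py values for Solariot's MQTT support.
# Key names come from the README; all values are placeholders.
mqtt_server = "192.168.1.50"   # placeholder broker address
mqtt_port = 1883               # conventional unencrypted MQTT port
mqtt_username = "solariot"     # placeholder credentials
mqtt_password = "changeme"
mqtt_topic = "inverter/stats"  # placeholder topic to publish readings to
```

With these set alongside the inverter settings from step 3 of the installation, Solariot publishes each scan of the register map to the configured topic.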
pvanalytics,"Quality control, filtering, feature labeling, and other tools for working with data from photovoltaic energy systems.",pvlib,https://github.com/pvlib/pvanalytics.git,github,"photovoltaic,python,renewable-energy,renewables,solar-energy",Photovoltaics and Solar Energy,"2023/10/17, 20:51:22",76,6,17,true,Python,pvlib,pvlib,Python,https://pvanalytics.readthedocs.io,"b'\n[](https://coveralls.io/github/pvlib/pvanalytics?branch=main)\n[](https://doi.org/10.5281/zenodo.6110569)\n\n\n# PVAnalytics\n\nPVAnalytics is a python library that supports analytics for PV\nsystems. It provides functions for quality control, filtering, and\nfeature labeling and other tools supporting the analysis of PV\nsystem-level data.\n\nPVAnalytics is available at [PyPI](https://pypi.org/project/pvanalytics/)\nand can be installed using `pip`:\n\n pip install pvanalytics\n\nDocumentation and example usage is available at \n[pvanalytics.readthedocs.io](https://pvanalytics.readthedocs.io).\n\n## Library Overview\n\nThe functions provided by PVAnalytics are organized in modules based\non their anticipated use. The structure/organization below is likely\nto change as use cases are identified and refined and as package\ncontent evolves. The functions in `quality` and\n`features` take a series of data and return a series of booleans.\nFor more detailed descriptions, see our\n[API Reference](https://pvanalytics.readthedocs.io/en/stable/api.html).\n\n* `quality` contains submodules for different kinds of data quality\n checks.\n * `data_shifts` contains quality checks for detecting and \n isolating data shifts in PV time series data.\n * `irradiance` provides quality checks for irradiance\n measurements. 
\n * `weather` has quality checks for weather data (for example tests\n for physically plausible values of temperature, wind speed,\n humidity, etc.)\n * `outliers` contains different functions for identifying outliers\n in the data.\n * `gaps` contains functions for identifying gaps in the data\n (i.e. missing values, stuck values, and interpolation).\n * `time` quality checks related to time (e.g. timestamp spacing)\n * `util` general purpose quality functions.\n\n* `features` contains submodules with different methods for\n identifying and labeling salient features.\n * `clipping` functions for labeling inverter clipping.\n * `clearsky` functions for identifying periods of clear sky\n conditions.\n * `daytime` functions for identifying periods of day and night.\n * `orientation` functions for labeling data as corresponding to\n a rotating solar tracker or a fixed tilt structure.\n * `shading` functions for identifying shadows.\n* `system` identification of PV system characteristics from data\n (e.g. nameplate power, orientation, azimuth)\n* `metrics` contains functions for computing PV system-level metrics\n'",",https://doi.org/10.5281/zenodo.6110569","2020/02/18, 17:10:34",1345,MIT,17,307,"2023/10/17, 20:51:27",34,112,165,28,7,8,1.8,0.512,"2022/12/16, 19:22:39",v0.1.3,0,7,false,,true,false,"hmendo/chrpa,hmendo/chrpa_test,slacgismo/pv-validation-hub,kperrynrel/time-shift-validation-hub,AlexandreHugoMathieu/pvfault_detection,sandialabs/pvOps",,https://github.com/pvlib,,,,,https://avatars.githubusercontent.com/u/11037261?v=4,,,
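The PVAnalytics README notes that functions in `quality` and `features` take a series of data and return a series of booleans. That "series in, boolean series out" convention can be sketched with plain pandas; this z-score flagger only illustrates the pattern and is not the library's own implementation (see `pvanalytics.quality.outliers` for that):

```python
import pandas as pd

def flag_outliers_zscore(series, threshold=3.0):
    """Return a boolean Series, True where a point deviates from the
    series mean by more than `threshold` sample standard deviations.
    Illustrative sketch of the quality-check convention only."""
    z = (series - series.mean()) / series.std()
    return z.abs() > threshold

# A short power-like series with one obvious bad reading.
data = pd.Series([1.0, 1.1, 0.9, 1.0, 25.0, 1.05])
mask = flag_outliers_zscore(data, threshold=2.0)
# mask is a boolean Series; only the 25.0 reading is flagged.
```

The boolean mask composes naturally with other checks via `&`/`|` and can be used directly to filter: `data[~mask]` keeps only the points that pass.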
IonMonger,A free and fast perovskite solar cell simulator with coupled ion vacancy and charge carrier dynamics in one dimension.,PerovskiteSCModelling,https://github.com/PerovskiteSCModelling/IonMonger.git,github,"perovskite-solar-cells,halide-ion-migration",Photovoltaics and Solar Energy,"2022/12/07, 13:12:54",30,0,7,true,MATLAB,,PerovskiteSCModelling,"MATLAB,Python,TeX",https://sites.google.com/view/ionmonger/home,"b'# IonMonger 2\n\nA drift-diffusion model for ion migration and charge carrier transport across a planar perovskite solar cell (PSC).\n\nThis code can be used to simulate the internal state of a PSC over time. The three core layers of a PSC, namely the electron transport layer, perovskite absorber layer and hole transport layer, are modelled explicitly in one spatial dimension. The model variables are the electric potential, halide ion vacancies (existing only within the perovskite layer), electrons (within the ETL and perovskite layers) and holes (within the perovskite and HTL). A variety of experimental protocols can be simulated, including changes in the applied voltage and/or illumination intensity that occur over timescales on the order of microseconds to hours and impedance spectroscopy. The code also outputs the current density and voltage which can be used to plot the current-voltage characteristics of a PSC, including current-voltage hysteresis due to the movement of halide ion vacancies. Please read the [GUIDE](GUIDE.md) to get started.\n\nFor details of changes to the code since the first release, see the Changelog on the [IonMonger Wiki](https://github.com/PerovskiteSCModelling/IonMonger/wiki).\n\n\n# Use Cases\n\nThis code is intended for use by researchers in the field of perovskite solar cells. 
Example use cases include:\n\n- simulating current-voltage curves, with the ability to change key material properties in order to investigate trends in performance and the extent of hysteresis\n- simulating photo-current or photo-voltage transients to investigate the effects of halide ion migration\n- visualising the effects of halide ion migration on the internal electrical state of a PSC\n- simulating impedance spectra, predicting and analysing the effects of material properties\n\nThe authors of this code published an investigation into how material properties of the transport layers affect perovskite solar cell performance in [Energy & Environmental Science](https://doi.org/10.1039/C8EE01576G), while working at the Universities of Southampton, Bath and Portsmouth.\n\n\n# Requirements and Other Information\n\nRequirements: MATLAB (version R2021a).\n\nThis code was first created at the University of Southampton in 2016. See [AUTHORS](AUTHORS.md) for a list of contributors and [LICENSE](LICENSE) for the conditions of use.\n\nIf you encounter a problem or any unexpected results, please create an Issue on the GitHub website, add details of the problem (including the error message and MATLAB version number) and attach the parameters.m file in use when the problem occurred. 
For other enquiries, please contact N.E.Courtier(at)soton.ac.uk.\n\nSome features of the code (for example, IS_solver.m and animate_sections.m) will make use of the Parallel Computing Toolbox and the Image Processing Toolbox for increased performance but can still run if the toolboxes are not installed.\n\n\n# How to Cite this Code\n\nPlease cite the release paper published in the [Journal of Computational Electronics](https://link.springer.com/article/10.1007/s10825-019-01396-2) by using the [citation.bib](citation.bib) file.\n\n\n# Technical Features\n\nThis code is based on the finite element scheme first presented in our paper in [Applied Mathematical Modelling](https://doi.org/10.1016/j.apm.2018.06.051) and is performed on a non-uniform (""tanh"") spatial grid.\n\nFiles in the main folder:\n - master.m for running a single simulation\n - parameters.m for setting the inputs to the simulation\n - reset_path.m adds all subfunctions to the MATLAB path\n - IonMongerLite.mlx for running simulations from a user-friendly interface\n\nThe Code/ folder contains all subfunctions, including\n - a function to turn a list of instructions into a protocol for the light or applied voltage\n - functions that provide the ability to find the steady-state Voc and simulate open-circuit conditions\n - a function to plot current-voltage (`J`-`V`) data as well as the recombination currents (`Jl`, `Jr`)\n\nThe solution is saved in dimensional form into one output file, which also contains the input data.\n\nThe Tests/ folder is for developers and contains a set of tests to check the consistency of future updates.\n'",",https://doi.org/10.1039/C8EE01576G,https://doi.org/10.1016/j.apm.2018.06.051","2019/08/22, 10:21:56",1525,AGPL-3.0,4,69,"2023/09/05, 12:41:46",0,27,29,5,50,0,0.7,0.09259259259259256,"2022/08/02, 14:24:06",v2.0,0,2,false,,false,false,,,https://github.com/PerovskiteSCModelling,,,,,https://avatars.githubusercontent.com/u/45872822?v=4,,,
rayflare,"Provide a flexible, user-friendly Python environment to model complex optical stacks, with a focus on solar cells.",qpv-research-group,https://github.com/qpv-research-group/rayflare.git,github,"physics,optics,raytracing,ray-tracing,rigorous-coupled-wave,transfer-matrix-method,solar-cells,multiscale-simulation",Photovoltaics and Solar Energy,"2023/03/21, 22:40:51",22,1,7,true,Python,Quantum Photovoltaics Research Group,qpv-research-group,"Python,TeX",,"b'[](https://www.gnu.org/licenses/gpl-3.0)\n[](https://codecov.io/gh/qpv-research-group/rayflare)\n[](https://www.codacy.com/gh/qpv-research-group/rayflare?utm_source=github.com&utm_medium=referral&utm_content=qpv-research-group/rayflare&utm_campaign=Badge_Grade)\n[](https://rayflare.readthedocs.io/en/latest/?badge=latest)\n[](https://joss.theoj.org/papers/15647ef7b3dd688b47c1b802a4f50a67)\n\n**Important**: Please check out the [news & updates](https://rayflare.readthedocs.io/en/latest/news.html) page for the most recent updates\nand changes, including any possible backwards compatibility issues. If you have questions, issues, etc., please check the\ndocumentation and (open and closed) [issues](https://github.com/qpv-research-group/rayflare/issues) first,\nor open a new issue using the relevant template.\n\n# rayflare\nOpen-source, integrated optical modelling of complex stacks. RayFlare incorporates the transfer-matrix method (TMM), \nray-tracing and rigorous coupled-wave analysis (RCWA/FMM), in addition to an angular redistribution matrix method which allows multiple \nmethods to be coupled across a single structure to calculate total absorption/reflection/transmission, absorption per \nlayer, and absorption profiles. \n\nYou can view RayFlare\'s documentation, including installation instructions [here](https://rayflare.readthedocs.io/en/latest/).\nThe contributing guidelines are [here](CONTRIBUTING.md) and the Code of Conduct is [here](CODE_OF_CONDUCT.md). 
This package\nis distributed under a [GNU GPL (version 3) license](GNU_GPL_v3.txt). If you have questions, issues, etc., please check the\ndocumentation first or open an [issue](https://github.com/qpv-research-group/rayflare/issues) using the relevant template.\n\nIf you use RayFlare in your work, please cite the [JOSS paper](https://doi.org/10.21105/joss.03460):\n\n*Pearce, P. M., (2021). RayFlare: flexible optical modelling of solar cells. Journal of Open Source Software, 6(65), 3460. \nhttps://doi.org/10.21105/joss.03460*\n\n\n'",",https://doi.org/10.21105/joss.03460","2019/06/12, 20:01:41",1595,CUSTOM,18,456,"2023/07/18, 03:04:11",12,33,49,8,99,2,0.0,0.0,"2023/03/21, 08:21:09",v1.2.0,0,1,false,,true,true,qpv-research-group/solcore-education,,https://github.com/qpv-research-group,https://www.qpvgroup.org,"UNSW (Sydney, Australia) and Imperial College London (UK) until 2017",,,https://avatars.githubusercontent.com/u/48552948?v=4,,,
pv-terms,Contains nomenclature for PV-relevant terms that are used in modeling and data analysis for PV systems.,DuraMAT,https://github.com/DuraMAT/pv-terms.git,github,,Photovoltaics and Solar Energy,"2020/08/03, 18:19:48",13,0,3,false,Python,,DuraMAT,"Python,Batchfile,Makefile,HTML",,"b""# pv-terms\n\nThe pv-terms project contains nomenclature for PV-relevant terms that are used in modeling and data analysis for PV systems.\n\nThe pv-terms project is a work in progress. The team would greatly appreciate feedback and suggestions. To comment, please open an issue.\n\nFormatted documentation available at http://duramat.github.io/pv-terms/\n\n## Building the documentation\n\nTo build the documentation locally, you'll need to install the sphinx\nrequirements. It's probably a good idea to be working in a virtual\nenvironment, but not strictly necessary. \n\n pip install -r requirements.txt\n\nThere are a few ways to build the docs. To do this, cd into the `docs_source` folder. The first method generates html files in `source/_build/html`:\n\n make html\n\nThe second method does the same thing but then copies the files into `docs/` so that they'll get detected by Github Pages:\n\n make github\n""",,"2020/03/13, 13:05:13",1321,BSD-3-Clause,0,158,"2020/07/23, 14:09:44",12,7,17,0,1189,0,0.0,0.5769230769230769,,,0,4,false,,false,false,,,https://github.com/DuraMAT,,,,,https://avatars.githubusercontent.com/u/55461754?v=4,,,
StatisticalClearSky,Statistical estimation of a clear sky signal from PV system power data.,slacgismo,https://github.com/slacgismo/StatisticalClearSky.git,github,,Photovoltaics and Solar Energy,"2022/06/23, 21:50:25",26,3,3,false,Jupyter Notebook,SLAC GISMo,slacgismo,"Jupyter Notebook,Python,Shell",,"b'# StatisticalClearSky\n\n\n\n | Latest Release | \n \n \n \n \n \n \n \n \n \n \n |
\n
\n\n_Statistical estimation of a clear sky signal from PV system power data_\n\nThis project implements an algorithm based on [Generalized Low Rank Models](https://stanford.edu/~boyd/papers/glrm.html) for estimating the output of a solar PV system under clear sky or ""cloudless"" conditions, given only measured power as an input. Notably, no system configuration information, modeling parameters, or correlated environmental data are required. You can read more about this work in these two papers [[1](https://arxiv.org/abs/1907.08279), [2](https://ieeexplore.ieee.org/abstract/document/8939335)].\n\nWe actually recommend that users generally not invoke this software directly. Instead, we recommend using the API provided by [Solar Data Tools](https://github.com/slacgismo/solar-data-tools).\n\n## Getting Started\n\nYou can install the pip package or the Anaconda package for this project.\n\n### Recommended: Set up `conda` environment with provided `.yml` file\n\n_Updated September 2020_\n\nWe recommend setting up a fresh Python virtual environment in which to use `solar-data-tools`. We recommend using the [Conda](https://docs.conda.io/projects/conda/en/latest/index.html) package management system, and creating an environment with the environment configuration file named `pvi-user.yml`, provided in the top level of this repository. 
This will install the `solar-data-tools` package as well.\n\nPlease see the Conda documentation page, ""[Creating an environment from an environment.yml file](https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html#creating-an-environment-from-an-environment-yml-file)"" for more information.\n\n### Installation\n\nIf you are using pip:\n\n```sh\n$ pip install statistical-clear-sky\n```\n\nAs of February 11, 2019, this fails because the scs package, installed as a dependency of cvxpy, expects numpy to be installed already.\n[scs issue 85](https://github.com/cvxgrp/scs/issues/85) says it is fixed.\nHowever, the fix doesn\'t seem to be reflected in its pip package.\nAlso, cvxpy doesn\'t work with numpy versions less than 1.16.\nAs a workaround, install numpy separately first and then install this package,\ni.e.\n```sh\n$ pip install \'numpy>=1.16\'\n$ pip install statistical-clear-sky\n```\n\nIf you are using Anaconda, the problem described above doesn\'t occur since numpy is already installed. And during statistical-clear-sky installation, numpy is upgraded above 1.16:\n\n```sh\n$ conda install -c slacgismo statistical-clear-sky\n```\n\n#### Solvers\n\nThe default convex solver included with cvxpy is ECOS, which is open source. However, this solver tends to fail on problems with more than 1000 variables, so it is not suitable for this algorithm.\n\nSo, the default behavior of the code is to use the commercial Mosek solver. 
Thus, we encourage you to install it separately as shown below and obtain a license on your own.\n\n* [mosek](https://www.mosek.com/resources/getting-started/) - For using the MOSEK solver.\n\n If you are using pip:\n ```sh\n $ pip install -f https://download.mosek.com/stable/wheel/index.html Mosek\n ```\n\n If you are using Anaconda:\n ```sh\n $ conda install -c mosek mosek==8.1.43\n ```\n\nAcademic licenses are available for free here: [https://www.mosek.com/products/academic-licenses/](https://www.mosek.com/products/academic-licenses/)\n\n## Usage\n\n### As a part of Python code or inside Jupyter notebook\n\n#### Example 1: Simplest example, with the fewest input parameters.\n\nUsing the default solver (the open source solver ECOS):\n\n```python\nimport numpy as np\nfrom statistical_clear_sky.algorithm.iterative_fitting import IterativeFitting\n\n# Usually read from a CSV file or a database with more data,\n# covering 1 day (column) and a few years (row):\npower_signals_d = np.array([[0.0, 0.0, 0.0, 0.0],\n [1.33389997, 1.40310001, 0.67150003, 0.77249998],\n [1.42349994, 1.51800001, 1.43809998, 1.20449996],\n [1.52020001, 1.45150006, 1.84809995, 0.99949998]])\n\niterative_fitting = IterativeFitting(power_signals_d)\n\niterative_fitting.execute()\n\nclear_sky_signals = iterative_fitting.clear_sky_signals()\ndegradation_rate = iterative_fitting.degradation_rate()\n```\n\n#### Example 2: Estimating clear sky signals without degradation.\n\nYou can estimate clear sky signals based on the assumption that there is no year-to-year degradation.\nIn this case, set the is_degradation_calculated keyword argument to False in the execute method.\nBy default, it is set to True.\n\n```python\nimport numpy as np\nfrom statistical_clear_sky.algorithm.iterative_fitting import IterativeFitting\n\n# Usually read from a CSV file or a database with more data,\n# covering 1 day (column) and a few years (row):\npower_signals_d = np.array([[0.0, 0.0, 0.0, 0.0],\n [1.33389997, 1.40310001, 
0.67150003, 0.77249998],\n [1.42349994, 1.51800001, 1.43809998, 1.20449996],\n [1.52020001, 1.45150006, 1.84809995, 0.99949998]])\n\niterative_fitting = IterativeFitting(power_signals_d)\n\niterative_fitting.execute(is_degradation_calculated=False)\n\nclear_sky_signals = iterative_fitting.clear_sky_signals()\n```\n\n#### Example 3: Using a different solver.\n\nThe default solver ECOS is not stable with large sets of input data.\nThe following example shows how to use the Mosek solver by passing the solver_type keyword argument to the constructor.\n\n```python\nimport numpy as np\nfrom statistical_clear_sky.algorithm.iterative_fitting import IterativeFitting\n\n# Usually read from a CSV file or a database with more data,\n# covering 1 day (column) and a few years (row):\npower_signals_d = np.array([[0.0, 0.0, 0.0, 0.0],\n [1.33389997, 1.40310001, 0.67150003, 0.77249998],\n [1.42349994, 1.51800001, 1.43809998, 1.20449996],\n [1.52020001, 1.45150006, 1.84809995, 0.99949998]])\n\niterative_fitting = IterativeFitting(power_signals_d,\n solver_type=\'MOSEK\')\n\niterative_fitting.execute()\n\nclear_sky_signals = iterative_fitting.clear_sky_signals()\ndegradation_rate = iterative_fitting.degradation_rate()\n```\n\n#### Example 4: Setting rank for Generalized Low Rank Modeling.\n\nBy default, the rank of the low rank matrices is 6.\nYou can change it by specifying the rank_k keyword argument in the constructor.\n\n```python\nimport numpy as np\nfrom statistical_clear_sky.algorithm.iterative_fitting import IterativeFitting\n\n# Usually read from a CSV file or a database with more data,\n# covering 1 day (column) and a few years (row):\npower_signals_d = np.array([[0.0, 0.0, 0.0, 0.0],\n [1.33389997, 1.40310001, 0.67150003, 0.77249998],\n [1.42349994, 1.51800001, 1.43809998, 1.20449996],\n [1.52020001, 1.45150006, 1.84809995, 0.99949998]])\n\niterative_fitting = IterativeFitting(power_signals_d, rank_k=6)\n\niterative_fitting.execute()\n\n# Get the resulting left 
low rank matrix for evaluation.\nleft_low_rank_matrix = iterative_fitting.left_low_rank_matrix()\n# The above can also be obtained as l_cs_value:\nl_cs_value = iterative_fitting.l_cs_value\n\n# Get the resulting right low rank matrix for evaluation.\nright_low_rank_matrix = iterative_fitting.right_low_rank_matrix()\n# The above can also be obtained as r_cs_value:\nr_cs_value = iterative_fitting.r_cs_value\n\nclear_sky_signals = iterative_fitting.clear_sky_signals()\n\ndegradation_rate = iterative_fitting.degradation_rate()\n# The above can also be obtained as beta_value:\nbeta_value = iterative_fitting.beta_value\n```\n\n#### Example 5: Setting different hyper-parameters for the minimization of the objective function of Generalized Low Rank Modeling.\n\nThere are three hyper-parameters in the objective function of Generalized Low Rank Modeling: mu_l, mu_r, and tau.\nBy default, mu_l is set to 1.0, mu_r to 20.0, and tau to 0.8.\nYou can change them by specifying the mu_l, mu_r, and tau keyword arguments in the execute method.\n\n```python\nimport numpy as np\nfrom statistical_clear_sky.algorithm.iterative_fitting import IterativeFitting\n\n# Usually read from a CSV file or a database with more data,\n# covering 1 day (column) and a few years (row):\npower_signals_d = np.array([[0.0, 0.0, 0.0, 0.0],\n [1.33389997, 1.40310001, 0.67150003, 0.77249998],\n [1.42349994, 1.51800001, 1.43809998, 1.20449996],\n [1.52020001, 1.45150006, 1.84809995, 0.99949998]])\n\niterative_fitting = IterativeFitting(power_signals_d)\n\niterative_fitting.execute(mu_l=5e2, mu_r=1e3, tau=0.9)\n\nclear_sky_signals = iterative_fitting.clear_sky_signals()\ndegradation_rate = iterative_fitting.degradation_rate()\n```\n\n#### Example 6: Setting different control parameters for the minimization of the objective function of Generalized Low Rank Modeling.\n\nThere are two control parameters for the minimization, i.e. 
the exit criterion (exit_criterion_epsilon) and the maximum number of iterations (max_iteration).\nBy default, exit_criterion_epsilon is set to 1e-3 and max_iteration to 100.\nYou can change them by specifying the exit_criterion_epsilon and max_iteration keyword arguments in the execute method.\n\n```python\nimport numpy as np\nfrom statistical_clear_sky.algorithm.iterative_fitting import IterativeFitting\n\n# Usually read from a CSV file or a database with more data,\n# covering 1 day (column) and a few years (row):\npower_signals_d = np.array([[0.0, 0.0, 0.0, 0.0],\n [1.33389997, 1.40310001, 0.67150003, 0.77249998],\n [1.42349994, 1.51800001, 1.43809998, 1.20449996],\n [1.52020001, 1.45150006, 1.84809995, 0.99949998]])\n\niterative_fitting = IterativeFitting(power_signals_d)\n\niterative_fitting.execute(exit_criterion_epsilon=1e-6, max_iteration=10)\n\nclear_sky_signals = iterative_fitting.clear_sky_signals()\ndegradation_rate = iterative_fitting.degradation_rate()\n```\n\n#### Example 7: Setting limits on the degradation rate.\n\nYou can specify the maximum and minimum degradation by setting the max_degradation and min_degradation keyword arguments in the execute method.\nBy default, they are not used.\n\n```python\nimport numpy as np\nfrom statistical_clear_sky.algorithm.iterative_fitting import IterativeFitting\n\n# Usually read from a CSV file or a database with more data,\n# covering 1 day (column) and a few years (row):\npower_signals_d = np.array([[0.0, 0.0, 0.0, 0.0],\n [1.33389997, 1.40310001, 0.67150003, 0.77249998],\n [1.42349994, 1.51800001, 1.43809998, 1.20449996],\n [1.52020001, 1.45150006, 1.84809995, 0.99949998]])\n\niterative_fitting = IterativeFitting(power_signals_d)\n\niterative_fitting.execute(max_degradation=0.0, min_degradation=-0.5)\n\nclear_sky_signals = iterative_fitting.clear_sky_signals()\ndegradation_rate = iterative_fitting.degradation_rate()\n```\n\n## Jupyter notebook examples\n\nAlternatively, you can clone this repository (with Git) and execute the example 
notebooks under the notebooks folder.\n\nThe simplest way to install dependencies if you are using pip is:\n\n```sh\n$ pip install -r requirements.txt\n```\n\nAs mentioned in the ""Getting Started"" section above,\nas of February 11, 2019, this fails because the scs package, installed as a dependency of cvxpy, expects numpy to be already installed.\n[scs issue 85](https://github.com/cvxgrp/scs/issues/85) says it is fixed; however, the fix does not seem to be reflected in the pip package.\nAlso, cvxpy does not work with numpy versions below 1.16.\nAs a workaround, install numpy separately first and then install the other packages using requirements.txt:\n```sh\n$ pip install \'numpy>=1.16\'\n$ pip install -r requirements.txt\n```\n\n## Running the tests\n\n### Unit tests (developer tests)\n\n1. Git clone this project.\n\n2. In the project directory in a terminal,\n\n ```\n $ python -m unittest\n ```\n\n This runs all the tests under the tests folder.\n\nAll the tests are placed in the ""tests"" directory directly under the project directory.\nThey use ""unittest"", which is part of the Python Standard Library.\nThere may be better unit testing frameworks, but unittest was chosen to invite as many contributors as possible from a variety of backgrounds.\n\n### Coding style tests\n\n[pylint](https://www.pylint.org/) is used to check that the coding style conforms to ""PEP 8 -- Style Guide for Python Code"".\n\nNote: We are open to using [LGTM](https://lgtm.com/).\nHowever, since we decided to use the code coverage tool [codecov](https://codecov.io/) based on a comment by the project\'s Technical Advisory Council, we decided not to add another tool that also does code coverage.\nWe are also open to using other coding style tools.\n\nExample of using pylint:\n\nIn the project directory in a terminal,\n```\n$ pylint statistical_clear_sky\n```\n\n## Contributing\n\nPlease read [CONTRIBUTING.md](https://github.com/bmeyers/StatisticalClearSky/contributing) for details on our code of conduct, and the process 
for submitting pull requests to us.\n\n## Versioning\n\nWe use [Semantic Versioning](http://semver.org/) for versioning. For the versions available, see the [tags on this repository](https://github.com/bmeyers/StatisticalClearSky/tags).\n\n## Authors\n\n* **Bennet Meyers** - *Initial work and Main research work* - [Bennet Meyers GitHub](https://github.com/bmeyers)\n\n* **Tadatoshi Takahashi** - *Refactoring and Packaging work and Research support work* - [Tadatoshi Takahashi GitHub](https://github.com/tadatoshi)\n\nSee also the list of [contributors](https://github.com/bmeyers/StatisticalClearSky/contributors) who participated in this project.\n\n## License\n\nThis project is licensed under the BSD 2-Clause License - see the [LICENSE](LICENSE) file for details\n\n## References\n\n[1] B. Meyers, M. Tabone, and E. C. Kara, ""Statistical Clear Sky Fitting Algorithm,"" IEEE Photovoltaic Specialists Conference, 2018.\n\n## Acknowledgments\n\n* The authors would like to thank Professor Stephen Boyd from Stanford University for his input and guidance and Chris Deline, Mike Deceglie, and Dirk Jordan from NREL for collaboration.\n'",",https://zenodo.org/badge/latestdoi/117483201,https://arxiv.org/abs/1907.08279","2018/01/15, 01:38:46",2109,BSD-2-Clause,0,530,"2022/06/23, 21:53:13",5,39,41,0,488,2,0.1,0.4156626506024096,"2022/06/23, 22:03:19",v0.4.6,0,6,false,,false,true,"MichaelHopwood/ForwardForwardOneclass,slacgismo/solar-data-tools,slacgismo/solar-data-pipeline",,https://github.com/slacgismo,https://gismo.slac.stanford.edu/,"SLAC National Accelerator Laboratory, Menlo Park, CA 94025",,,https://avatars.githubusercontent.com/u/19895500?v=4,,,
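The StatisticalClearSky usage examples above hard-code a tiny `power_signals_d` matrix and note that real data are usually read from a CSV file or a database. A minimal sketch of that preprocessing step, assuming a fixed sampling interval and whole days of data; the helper name `to_day_matrix` and the row/column orientation are illustrative assumptions, not part of the library, so check them against the documentation before use:

```python
import numpy as np

# Hypothetical helper (not part of statistical-clear-sky): reshape a 1-D
# power time series into a 2-D day matrix like power_signals_d. Assumes a
# fixed sampling interval and trims any trailing partial day (e.g. 288
# samples/day at 5-minute resolution).
def to_day_matrix(power_series, samples_per_day=288):
    n_days = len(power_series) // samples_per_day
    trimmed = np.asarray(power_series[: n_days * samples_per_day], dtype=float)
    # One row per time-of-day sample, one column per day.
    return trimmed.reshape(n_days, samples_per_day).T

# Example: two days of synthetic data at 4 samples/day.
series = [0.0, 1.2, 1.4, 0.0, 0.0, 1.1, 1.5, 0.0]
matrix = to_day_matrix(series, samples_per_day=4)
print(matrix.shape)  # (4, 2): 4 time-of-day rows, 2 day columns
```

The resulting matrix can then be passed to `IterativeFitting` as in the examples above.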
Photovoltaic_Fault_Detector,Model-definition is a deep learning application for fault detection in photovoltaic plants.,RentadroneCL,https://github.com/RentadroneCL/Photovoltaic_Fault_Detector.git,github,"yolo3,detector-model,model-detection,detection-boxes,fault-detection,solar-energy,photovoltaic-panels,deep-learning,keras,object-detection,tensorflow",Photovoltaics and Solar Energy,"2023/01/17, 13:43:26",32,0,14,true,Jupyter Notebook,simplemap.io (former Rentadrone.cl),RentadroneCL,"Jupyter Notebook,HTML,Python",https://simplemap.io,"b'\n# Photovoltaic Fault Detector\n\n\n[](CODE_OF_CONDUCT.md)\n[](https://www.codetriage.com/rentadronecl/photovoltaic_fault_detector)\n[](https://coveralls.io/github/RentadroneCL/Photovoltaic_Fault_Detector)\n\n[SimpleMap.io](https://simplemap.io/)\n\n## Forum\n\nThis project is part of the [UNICEF Innovation Fund Discourse community](https://unicef-if.discourse.group/c/projects/rentadrone/10). You can post comments or questions about each category of [SimpleMap.io Open-Source Initiative](https://rentadronecl.github.io) algorithms. We encourage users to participate in the forum and to engage with fellow users.\n\n## Summary\n\nModel-definition is a deep learning application for fault detection in photovoltaic plants. In this repository you will find trained detection models that point out where the panel faults are by using radiometric thermal infrared pictures. 
The [Web-API](https://github.com/RentadroneCL/Web-API) repository contains a performant, production-ready reference implementation of this project.\n\n\n\n## To do list:\n\n- [x] Import model detection (SSD & YOLO3)\n- [x] Example use Trained Model\n- [x] Train and Evaluate Model with own data\n- [x] Model Panel Detection (SSD7)\n- [x] Model Panel Detection (YOLO3)\n- [x] Model Soiling Fault Detection (YOLO3)\n- [x] Model Diode Fault Detection (YOLO3)\n- [x] Model Other Fault Detection\n- [x] Model Fault Panel Disconnect\n\n## Requirements\n\n* Python 3.x\n* Numpy\n* TensorFlow 2.x\n* Keras 2.x (in TensorFlow)\n* OpenCV\n* Beautiful Soup 4.x\n\n## Quickstart\nIn the project root, execute the following command to install all project dependencies:\n\n```\npip install -r requirements.txt\n```\nYou need to install Jupyter Notebook to run the code examples. You can find the installation documentation for the [Jupyter platform, on ReadTheDocs](https://jupyter.readthedocs.io/en/latest/install.html) or on the GitHub page [here](https://github.com/jupyter/notebook).\n\nFor a local installation, make sure you have pip installed and run:\n```\npip install notebook\n```\n\n## Example to use trained model\n[\'Example_Prediction\'](Code_Example/Example_prediction.ipynb) shows how to use an already trained model; it can be modified to change the model used and the image in which you want to detect faults.\n\n[\'Example Prediction AllInOne\'](Code_Example/Example%20Detection%20AllInOne.ipynb) shows how to apply all trained models; you can use this code to run prediction on a folder of images and produce output images with detection boxes.\n\n[\'Example_Prediction_Orthophoto\'](Code_Example/Example_prediction_Ortofoto.ipynb) shows how to apply all trained models to an orthophoto and produce an output image with detection boxes.\n\n\n## Developers\nHelp improve our software! 
We welcome contributions from everyone, whether to add new features, improve speed, fix existing bugs or add support. [Check our code of conduct](CODE_OF_CONDUCT.md), [the contributing guidelines](CONTRIBUTING.md) and how decisions are made.\n\nAny code contributions are welcomed as long as they are discussed in [Github Issues](https://github.com/RentadroneCL/model-definition/issues) with maintainers. Be aware that if you decide to change something and submit a PR on your own, it may not be accepted.\n\n#### Creating an issue\nYou can open a new issue based on code from an existing pull request. For more information, see [the template for filing issues](https://github.com/RentadroneCL/model-definition/blob/master/.github/ISSUE_TEMPLATE/feature_request.md)\n\n\n# Model Detection\nThe models used for detection are SSD [SSD: Single Shot MultiBox Detector](https://arxiv.org/abs/1512.02325) and YOLOv3 [YOLOv3: An Incremental Improvement](https://arxiv.org/abs/1804.02767); they are imported from the following repositories:\n* [SSD_Keras](https://github.com/pierluigiferrari/ssd_keras#how-to-fine-tune-one-of-the-trained-models-on-your-own-dataset)\n* [YOLOv3_Keras](https://github.com/experiencor/keras-yolo3)\n\nGrab the pretrained weights of SSD and YOLO3 from [Drive_Weights](https://drive.google.com/drive/folders/1LSc9FkAwJrAAT8pAUWz8aax_biFAMMXS?usp=sharing)\n\n| Model | Pretrained Weights |\n|:-----------:|:-------------------:|\n| SSD7/SSD300 | [Weight VGG16](https://drive.google.com/open?id=1VHTx28tGI94yFqwT_WHp-xkx_8Hh_A31)|\n| YOLO3 | [Weight Full Yolo3](https://drive.google.com/open?id=1cnCQHl-TnOrwb-leug1I0O9vMBaSwJLt)|\n\n\n## Type of Data\nThe images used for the design of this model were extracted by air analysis, specifically: FLIR aerial radiometric thermal infrared pictures, taken by UAV (R-JPEG format). 
These were converted into .jpg images for the training of these detection models.\nExample FLIR image:\n\n\n\nSame image in .jpg format:\n\n\n\n## Training\n\n### 1. Data preparation\n\nSee the folders Train&Test_A/ and Train&Test_S/ for examples of panel annotations and soiling fault annotations.\n\nOrganize the dataset into 4 folders:\n\n+ train_image_folder <= the folder that contains the train images.\n\n+ train_annot_folder <= the folder that contains the train annotations in VOC format.\n\n+ valid_image_folder <= the folder that contains the validation images.\n\n+ valid_annot_folder <= the folder that contains the validation annotations in VOC format.\n\nThere is a one-to-one correspondence by file name between images and annotations.\nTo create your own dataset, use the LabelImg tool from:\n[https://github.com/tzutalin/labelImg](https://github.com/tzutalin/labelImg)\n\n### 2. Edit the configuration file\nThe configuration file for YOLO3 is a JSON file, which looks like this (soiling fault example):\n\n```json\n{\n ""model"" : {\n ""min_input_size"": 400,\n ""max_input_size"": 400,\n ""anchors"": [5,7, 10,14, 15, 15, 26,32, 45,119, 54,18, 94,59, 109,183, 200,21],\n ""labels"": [""1""],\n\t""backend"": \t\t""full_yolo_backend.h5""\n },\n\n ""train"": {\n ""train_image_folder"": ""../Train&Test_S/Train/images/"",\n ""train_annot_folder"": ""../Train&Test_S/Train/anns/"",\n\t""cache_name"": ""../Experimento_fault_1/Resultados_yolo3/full_yolo/experimento_fault_1_gpu.pkl"",\n\n ""train_times"": 1,\n\n ""batch_size"": 2,\n ""learning_rate"": 1e-4,\n ""nb_epochs"": 200,\n ""warmup_epochs"": 15,\n ""ignore_thresh"": 0.5,\n ""gpus"": ""0,1"",\n\n\t""grid_scales"": [1,1,1],\n ""obj_scale"": 5,\n ""noobj_scale"": 1,\n ""xywh_scale"": 1,\n ""class_scale"": 1,\n\n\t""tensorboard_dir"": ""log_experimento_fault_gpu"",\n\t""saved_weights_name"": ""../Experimento_fault_1/Resultados_yolo3/full_yolo/experimento_yolo3_full_fault.h5"",\n ""debug"": true\n },\n\n ""valid"": {\n ""valid_image_folder"": 
""../Train&Test_S/Test/images/"",\n ""valid_annot_folder"": ""../Train&Test_S/Test/anns/"",\n ""cache_name"": ""../Experimento_fault_1/Resultados_yolo3/full_yolo/val_fault_1.pkl"",\n\n ""valid_times"": 1\n },\n ""test"": {\n ""test_image_folder"": ""../Train&Test_S/Test/images/"",\n ""test_annot_folder"": ""../Train&Test_S/Test/anns/"",\n ""cache_name"": ""../Experimento_fault_1/Resultados_yolo3/full_yolo/test_fault_1.pkl"",\n\n ""test_times"": 1\n }\n}\n```\nThe configuration file for SSD300 is a JSON file, which looks like this (soiling fault example), together with a .txt file listing the image names (train.txt):\n```\n{\n ""model"" : {\n ""backend"": ""ssd300"",\n ""input"": 400,\n ""labels"": [""1""]\n },\n\n ""train"": {\n ""train_image_folder"": ""Train&Test_S/Train/images"",\n ""train_annot_folder"": ""Train&Test_S/Train/anns"",\n ""train_image_set_filename"": ""Train&Test_S/Train/train.txt"",\n\n ""train_times"": 1,\n ""batch_size"": 12,\n ""learning_rate"": 1e-4,\n ""warmup_epochs"": 3,\n ""nb_epochs"": 100,\n\t ""saved_weights_name"": ""Result_ssd300_fault_1/experimento_ssd300_fault_1.h5"",\n ""debug"": true\n },\n ""valid"": {\n ""valid_image_folder"": ""../Train&Test_D/Test/images/"",\n ""valid_annot_folder"": ""../Train&Test_D/Test/anns/"",\n ""valid_image_set_filename"": ""../Train&Test_D/Test/test.txt""\n },\n\n""test"": {\n ""test_image_folder"": ""Train&Test_S/Test/images"",\n ""test_annot_folder"": ""Train&Test_S/Test/anns"",\n ""test_image_set_filename"": ""Train&Test_S/Test/test.txt""\n }\n}\n```\n\n### 3. Start the training process\n\n`python train_ssd.py -c config.json -o /path/to/result`\n\nor\n`python train_yolo.py -c config.json -o /path/to/result`\n\nBy the end of this process, the code will write the weights of the best model to the file best_weights.h5 (or whatever name is specified in the ""saved_weights_name"" setting of the config.json file). The training process stops when the loss on the validation set has not improved for 20 consecutive epochs.\n\n### 4. 
Perform detection using trained weights on an image or set of images\n\n`python predict_ssd.py -c config.json -i /path/to/image/or/video -o /path/output/result`\nor\n`python predict_yolo.py -c config.json -i /path/to/image/or/video -o /path/output/result`\n\nIt carries out detection on the image and writes the image with detected bounding boxes to the output folder.\n\n## Evaluation\nEvaluation is integrated into the training process. If you want to run an independent evaluation, go to the folder ssd_keras-master or keras-yolo3-master and use the following command:\n\n`python evaluate.py -c config.json` \nExample:\n`python keras-yolo3-master/evaluate.py -c config_full_yolo_fault_1_infer.json` \n\nThis computes the mAP performance of the model defined in `saved_weights_name` on the validation dataset defined in `valid_image_folder` and `valid_annot_folder`.\n\n| Model | mAP | Config |\n|:--------------:|:------------------:|:------------------:|\n| YOLO3 Soiling | 0.7302 | [config](config_full_yolo_fault_1_infer.json) |\n| YOLO3 Diode | 0.6127 | [config](config_full_yolo_fault_4_infer.json)|\n| YOLO3 Affected Cell | 0.7230 | [config](config_full_yolo_fault_2_infer.json)|\n\n\n# Weights of Trained Models\nAll weights of the trained models can be downloaded from [Drive_Weights](https://drive.google.com/drive/folders/1LSc9FkAwJrAAT8pAUWz8aax_biFAMMXS?usp=sharing)\n\n| Model | Weights Trained | Config |\n|:--------------:|:------------------:|:--------:|\n| SSD7 Panel | [weight](https://drive.google.com/open?id=1qNjfAp9sW1VJh8ewnb3NKuafhZockTqV) | [config](Result_ssd7_panel/config_7_panel.json) |\n| SSD300 Soiling | [weight](https://drive.google.com/open?id=1IiOyYW8yPAh4IALbM_ZVqRhLdxV-ZSPw) | [config](config_300_fault_1.json) |\n| YOLO3 Panel | [weight](https://drive.google.com/open?id=14zgtgDJv3KTvhRC-VOz6sqsGPC_bdrL1) | [config](config_full_yolo_panel_infer.json) |\n| YOLO3 Soiling | [weight](https://drive.google.com/open?id=1YLgkn1wL5xAGOpwd2gzdfsJVGYPzszn-) | 
[config](config_full_yolo_fault_1_infer.json) |\n| YOLO3 Diode | [weight](https://drive.google.com/open?id=1VUtrK9JVTbzBw5dX7_dgLTMToFHbAJl1) | [config](config_full_yolo_fault_4_infer.json) |\n| YOLO3 Affected Cell | [weight](https://drive.google.com/open?id=1ngyCzw7xF0N5oZnF29EIS5LOl1PFkRRM) | [config](config_full_yolo_fault_2_infer.json) |\n\nThe images used are specified in [Table images](Training_Images.xlsx).\nYou can see some examples in [Summary of results](README_Result.md).\n\n# Contributing\n\nContributions are welcome and will be fully credited. We accept contributions via Pull Requests on GitHub.\n\n## Pull Request Checklist\n\nBefore sending your pull requests, make sure you have followed this list.\n\n- Read [contributing guidelines](CONTRIBUTING.md).\n- Read [Code of Conduct](CODE_OF_CONDUCT.md).\n- Check that your changes are consistent with the [guidelines](https://github.com/RentadroneCL/model-definition/blob/master/CONTRIBUTING.md#general-guidelines-and-philosophy-for-contribution).\n- Check that your changes are consistent with the [Coding Style](https://github.com/RentadroneCL/model-definition/blob/master/CONTRIBUTING.md#c-coding-style).\n\n\n'",",https://arxiv.org/abs/1512.02325,https://arxiv.org/abs/1804.02767","2020/01/16, 13:50:45",1378,GPL-3.0,4,152,"2023/04/11, 14:42:56",6,27,32,6,197,2,0.0,0.19827586206896552,,,2,4,false,,true,true,,,https://github.com/RentadroneCL,https://simplemap.io,"Santiago, Chile",,,https://avatars.githubusercontent.com/u/32961692?v=4,,,
nasapower,"Aims to make it quick and easy to automate downloading NASA-POWER global meteorology, surface solar energy and climatology data in your R session as a tidy data frame tibble object for analysis and use in modeling or other purposes.",ropensci,https://github.com/ropensci/nasapower.git,github,"agroclimatology,weather,r,nasa-power,nasa,agroclimatology-data,weather-variables,weather-data,earth-science,rstats,data-access,r-package",Photovoltaics and Solar Energy,"2023/10/17, 23:39:09",92,0,11,true,R,rOpenSci,ropensci,"R,TeX",https://docs.ropensci.org/nasapower,"b'---\noutput: github_document\n---\n\n# {nasapower}: NASA POWER API Client
\n\n\n\n[](https://github.com/ropensci/nasapower/actions) \n[](https://app.codecov.io/gh/ropensci/nasapower) \n[](https://zenodo.org/badge/latestdoi/109224461) \n[](https://www.repostatus.org/#active) \n[](https://github.com/ropensci/software-review/issues/155) \n[](https://doi.org/10.21105/joss.01035)\n[](https://CRAN.R-project.org/package=nasapower)\n\n\n## POWER data vs {nasapower}\n\nPlease note that {nasapower} is **NOT** the source of NASA POWER data.\nIt is only an API client that allows easy access to the data.\n{nasapower} does not redistribute the data or provide it in any way, *we encourage users to follow the requests of the POWER Project Team and properly acknowledge them for the data rather than citing this package* (unless you have actually used it in your work).\n\n >*When POWER data products are used in a publication, we request the following acknowledgement be included:\n ""These data were obtained from the NASA Langley Research Center POWER Project funded through the NASA Earth\n Science Directorate Applied Science Program.""*\n\nThe previous statement that properly cites the POWER data is different than the citation for {nasapower}.\nTo cite this R package, {nasapower}, please use the output from `citation(package = ""nasapower"")` and cite both the package manual, which includes the version you used and the paper which refers to the peer-review of the software package as the functionality of the package has changed and will likely change to match the API in the future as necessary.\n\n## About {nasapower}\n\n{nasapower} aims to make it quick and easy to automate *downloading* of the [NASA-POWER](https://power.larc.nasa.gov) global meteorology, surface solar energy and climatology data in your R session as a tidy data frame `tibble` object for analysis and use in modelling or other purposes.\nPOWER (Prediction Of Worldwide Energy Resource) data are freely available for download with varying spatial resolutions dependent on the original data and 
with several temporal resolutions depending on the POWER parameter and community.\n\n**Note that the data are not static and may be replaced with improved data.**\nPlease see for detailed information in this regard.\n\n### Quick start\n\n{nasapower} can easily be installed using the following code.\n\n#### From CRAN\n\nThe stable version is available through CRAN.\n\n\n```r\ninstall.packages(""nasapower"")\n```\n\n#### From GitHub for the version in-development\n\nA development version is available through GitHub.\n\n\n```r\nif (!require(""remotes"")) {\n install.packages(""remotes"")\n}\n\nremotes::install_github(""ropensci/nasapower"")\n```\n\n### Example\n\nFetch daily \xe2\x80\x9cag\xe2\x80\x9d community temperature, relative humidity and precipitation for January 1, 1985 for Kingsthorpe, Queensland, Australia.\n\n\n```r\nlibrary(""nasapower"")\ndaily_ag <- get_power(community = ""ag"",\n lonlat = c(151.81, -27.48),\n pars = c(""RH2M"", ""T2M"", ""PRECTOTCORR""),\n dates = ""1985-01-01"",\n temporal_api = ""daily""\n )\ndaily_ag\n```\n\n```\n## NASA/POWER CERES/MERRA2 Native Resolution Daily Data \n## Dates (month/day/year): 01/01/1985 through 01/01/1985 \n## Location: Latitude -27.48 Longitude 151.81 \n## Elevation from MERRA-2: Average for 0.5 x 0.625 degree lat/lon region = 442.77 meters \n## The value for missing source data that cannot be computed or is outside of the sources availability range: NA \n## Parameter(s): \n## \n## Parameters: \n## RH2M MERRA-2 Relative Humidity at 2 Meters (%) ;\n## T2M MERRA-2 Temperature at 2 Meters (C) ;\n## PRECTOTCORR MERRA-2 Precipitation Corrected (mm/day) \n## \n## # A tibble: 1 \xc3\x97 10\n## LON LAT YEAR MM DD DOY YYYYMMDD RH2M T2M PRECTOTCORR\n## \n## 1 152. 
-27.5 1985 1 1 1 1985-01-01 54.7 24.9 0.9\n```\n\n## Documentation\n\nMore documentation is available in the vignette in your R session, `vignette(""nasapower"")` or available online, .\n\n## Meta\n\n- Please [report any issues or bugs](https://github.com/ropensci/nasapower/issues).\nPlease note that the {nasapower} project is released with a [Contributor Code of Conduct](https://github.com/ropensci/nasapower/blob/main/CODE_OF_CONDUCT.md).\nBy participating in the {nasapower} project you agree to abide by its terms.\n\n- License: MIT\n\n## References\n\n\n\n\n'",",https://zenodo.org/badge/latestdoi/109224461,https://doi.org/10.21105/joss.01035","2017/11/02, 06:08:53",2183,CUSTOM,96,2099,"2023/07/26, 22:54:07",0,6,77,6,90,0,0.5,0.0025759917568263235,"2023/08/20, 00:25:05",v4.0.11,0,4,false,,true,true,,,https://github.com/ropensci,https://ropensci.org/,"Berkeley, CA",,,https://avatars.githubusercontent.com/u/1200269?v=4,,,
pvcompare,A model for comparing the benefits of different PV technologies in a specified local energy system in different energy supply scenarios.,greco-project,https://github.com/greco-project/pvcompare.git,github,,Photovoltaics and Solar Energy,"2021/07/28, 07:50:38",10,0,0,false,Python,GRECO project,greco-project,"Python,Fortran,Shell",,"b'|badge_docs| |badge_CI| |badge_coverage| |badge_zenodo|\n\nDeprecated: |badge_travis| \n\n.. |badge_docs| image:: https://readthedocs.org/projects/pvcompare/badge/?version=latest\n :target: https://pvcompare.readthedocs.io/en/latest/?badge=latest\n :alt: Documentation Status\n\n.. |badge_CI| image:: https://github.com/greco-project/pvcompare/actions/workflows/main.yml/badge.svg\n :target: https://github.com/greco-project/pvcompare/actions/workflows/main.yml\n :alt: Build status\n\n.. |badge_coverage| image:: https://coveralls.io/repos/github/greco-project/pvcompare/badge.svg?branch=dev\n :target: https://coveralls.io/github/greco-project/pvcompare?branch=dev\n :alt: Test coverage\n\n.. |badge_travis| image:: https://travis-ci.com/greco-project/pvcompare.svg?branch=dev\n :target: https://travis-ci.com/greco-project/pvcompare\n\n.. |badge_zenodo| image:: https://zenodo.org/badge/224614782.svg\n :target: https://zenodo.org/badge/latestdoi/224614782\n\n\npvcompare\n~~~~~~~~~\n\nIntroduction\n============\n\n*pvcompare* is a model that compares the benefits of different PV technologies in a specified energy system by running\nan energy system optimization. 
This model concentrates on the integration of PV technologies into local energy systems but could\neasily be enhanced to analyse other conversion technologies.\n\nThe functionalities include\n\n* calculation of an area potential for PV on rooftops and fa\xc3\xa7ades based on building parameters,\n* calculation of heat and electricity demand time series for a specific amount of people living in these buildings,\n* calculation of PV feed-in time series for a set of PV installations on rooftops and fa\xc3\xa7ades incl. different technologies,\n\n * all technologies in the database of `pvlib `_,\n * a specific concentrator-PV module (`CPV `_) and\n * a module of perovskite-silicon cells (`PeroSI `_),\n\n* calculation of temperature dependent COPs or respectively EERs for heat pumps and chillers,\n* download and formatting of `ERA5 weather data `_ (global reanalysis data set),\n* preparation of data and input files for the energy system optimization,\n* a sensitivity analysis for input parameters and\n* visualisations for the comparison of different technologies.\n\nThe model is being developed within the scope of the H2020 project `GRECO `_.\nThe energy system optimization is based on the `oemof-solph `_ python package,\nwhich *pvcompare* calls via the `Multi-Vector Simulator (MVS) `_, a\ntool for assessing and optimizing Local Energy Systems (LES).\n\nDocumentation\n=============\n\nThe full documentation can be found at `readthedocs `_.\n\nInstallation\n============\n\nTo install *pvcompare* follow these steps:\n\n- Clone *pvcompare* and navigate to the directory ``\\pvcompare`` containing the ``setup.py``:\n\n::\n\n git clone git@github.com:greco-project/pvcompare.git\n cd pvcompare\n\n- Install the package:\n\n::\n\n pip install -e .\n\n- For the optimization you need to install a solver. You can download the open source `cbc-solver `_ from https://ampl.com/dl/open/cbc/ . Please follow the installation `steps `_ in the oemof installation instructions. 
You can also find information about other solvers there.\n\nExamples and basic usage\n========================\nThe basic usage of *pvcompare* is explained in the documentation in section `Basic usage of pvcompare `_.\nExamples are provided on github in the directory `examples/ `_.\n\nContributing\n============\n\nWe warmly welcome everyone who wants to contribute to *pvcompare*.\nPlease read our `Contributing Guidelines `_.\nYou can also get in contact by opening an `issue on github `_.\n'",",https://zenodo.org/badge/latestdoi/224614782\n\n\npvcompare\n~~~~~~~~~\n\nIntroduction\n============\n\n*pvcompare*","2019/11/28, 09:16:52",1427,AGPL-3.0,0,2042,"2021/07/20, 14:27:30",32,168,302,0,827,4,1.8,0.5308775731310942,"2021/05/29, 15:29:06",v0.0.3,0,5,false,,false,true,,,https://github.com/greco-project,https://www.greco-project.eu/,,,,https://avatars.githubusercontent.com/u/50671643?v=4,,,
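The first functionality listed for *pvcompare* — estimating a rooftop PV area potential from building parameters — can be illustrated with a back-of-the-envelope sketch. The helper below is hypothetical (invented name and default values), not pvcompare's actual model:

```python
def rooftop_pv_potential(footprint_m2, usable_fraction=0.4, module_eff=0.20):
    """Rough rooftop PV potential from building parameters.

    Illustrative only: assumes a flat usable fraction of the building
    footprint and module efficiency at standard test conditions
    (1000 W/m^2), so peak kW = usable m^2 * efficiency.
    """
    usable_m2 = footprint_m2 * usable_fraction
    peak_kw = usable_m2 * module_eff  # 1 kW/m^2 irradiance at STC
    return usable_m2, peak_kw

usable_m2, peak_kw = rooftop_pv_potential(500.0)  # 500 m^2 footprint
```

pvcompare replaces such flat assumptions with building- and population-specific parameters and feeds the result into the energy system optimization.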
SolTrace,A software tool developed at NREL to model concentrating solar power (CSP) systems and analyze their optical performance.,NREL,https://github.com/NREL/SolTrace.git,github,,Photovoltaics and Solar Energy,"2023/09/25, 17:19:34",36,0,10,true,C++,National Renewable Energy Laboratory,NREL,"C++,HTML,TeX,Python,JavaScript,CSS,CMake,Inno Setup,Batchfile,Makefile",https://www.nrel.gov/csp/soltrace.html,"b'# SolTrace\n\nThe SolTrace Open Source Project repository contains the source code, tools, and instructions to build a desktop version of the National Renewable Energy Laboratory\'s SolTrace. SolTrace is a software tool developed at NREL to model concentrating solar power (CSP) systems and analyze their optical performance. Although ideally suited for solar applications, the code can also be used to model and characterize many general optical systems. The creation of the code evolved out of a need to model more complex solar optical systems than could be modeled with existing tools. For more details about SolTrace\'s capabilities, see the [SolTrace website](https://www.nrel.gov/csp/soltrace.html). For details on integration with SAM, see the [SAM website](https://sam.nrel.gov).\n\nThe desktop version of SolTrace for Windows or Linux builds from the following open source projects:\n\n* [LK](https://github.com/nrel/lk) is a scripting language that is integrated into SAM and allows users to add functionality to the program.\n\n* [wxWidgets](https://www.wxwidgets.org/) is a cross-platform graphical user interface platform used for SAM\'s user interface, and for the development tools included with SSC (SDKtool) and LK (LKscript). 
The current version of SAM uses wxWidgets 3.1.0.\n\n* [WEX](https://github.com/nrel/wex) is a set of extensions to wxWidgets for custom user-interface elements used by SAM, and by LKscript and DView, which are integrated into SAM.\n\n* This repository, **SolTrace**, provides the user interface to assign values to inputs of the computational modules, run the modules in the correct order, and display calculation results. It also includes tools for editing LK scripts and viewing ray intersection and flux map data.\n\n## Quick Steps for Building SolTrace\n\nFor detailed build instructions see the [wiki](https://github.com/NREL/SolTrace/wiki), with specific instructions for:\n\n* [Windows](https://github.com/NREL/SolTrace/wiki/build-windows)\n* [OSX](https://github.com/NREL/SolTrace/wiki/build-osx)\n* [Linux](https://github.com/NREL/SolTrace/wiki/build-linux)\n\nThese are the general quick steps you need to follow to set up your computer for developing SolTrace:\n\n1. Set up your development tools:\n\n * Windows: Visual Studio 2019 Community or other editions available at [https://www.visualstudio.com/](https://www.visualstudio.com/).\n * Linux: g++ compiler available at [http://www.cprogramming.com/g++.html](http://www.cprogramming.com/g++.html) or as part of the Linux distribution.\n\n2. Download and install CMake 3.19 or higher from [https://cmake.org/download/](https://cmake.org/download/) with the ```Add CMake to the System Path for ...``` option selected.\n\n3. Download the wxWidgets 3.1.5 source code for your operating system from [https://www.wxwidgets.org/downloads/](https://www.wxwidgets.org/downloads/).\n\n4. Build wxWidgets.\n\n5. On Windows, create the WXMSW3 environment variable on your computer to point to the wxWidgets installation folder, or, on Linux, create the symbolic link `/usr/local/bin/wx-config-3` pointing to `/path/to/wxWidgets/bin/wx-config`.\n\n6. 
As you did for wxWidgets, for each of the following projects, clone (download) the repository and then (Windows only) create an environment variable pointing to the project folder. \n\n\n| Project | Repository URL | Windows Environment Variable |
\n| ------- | -------------- | ---------------------------- |\n| LK | https://github.com/NREL/lk | LKDIR |
\n| WEX | https://github.com/NREL/wex | WEXDIR |
\n
\n\n7. Run CMake to create the project build files\n 1. Copy the file ```parent-dir-CMakeLists.txt``` into the parent directory also containing ```soltrace/ lk/ wex/``` and ```wxwidgets-3.x.x/``` folders.\n \n 2. Rename this file to ```CMakeLists.txt``` before running cmake. You may need to temporarily rename any other file in this directory with the same name. \n \n E.g., the file should be at ```C:/stdev/CMakeLists.txt```\n\n 3. Create a directory in the main parent folder to store the build files. \n E.g., ```C:/stdev/build-soltrace/```\n \n 4. Open a shell or command window in the build folder from step 3\n\n 5. Copy the following cmake command to the shell and run. Replace the cmake target with a [supported generator](https://cmake.org/cmake/help/latest/manual/cmake-generators.7.html#manual:cmake-generators(7))\n \n ```> cmake -G ""Visual Studio 16 2019"" -DCMAKE_CONFIGURATION_TYPES=""Debug;Release"" -DCMAKE_SYSTEM_VERSION=10.0 -DSAM_SKIP_TOOLS=1 .. ```\n\n 6. Confirm the project files built. If running visual studio, you should see a ```soltrace_ui.sln``` file in the build-soltrace/ directory.\n \n 7. Build all files. The output is stored in the soltrace repository folder, e.g., ```C:/stdev/soltrace/app/deploy/soltrace.exe```. \n\n Note that output is NOT stored in the ```build-soltrace/``` directory!\n\n## Contributing\n\nIf you would like to report an issue with SolTrace or make a feature request, please let us know by adding a new issue on the [issues page](https://github.com/NREL/SolTrace/issues).\n\nIf you would like to submit code to fix an issue or add a feature, you can use GitHub to do so. Please see [Contributing](CONTRIBUTING.md) for instructions.\n\n## License\n\nSolTrace\'s open source code is copyrighted by the Alliance for Sustainable Energy and licensed under a [mixed MIT and GPLv3 license](LICENSE.md). 
It allows for-profit and not-for-profit organizations to develop and redistribute software based on SolTrace under the terms of an MIT license, and requires that research entities, including national laboratories, colleges and universities, and non-profit organizations, make the source code of any redistribution publicly available under the terms of a GPLv3 license.\n\n## Citing SolTrace\n\nWe appreciate your use of SolTrace, and ask that you appropriately cite the software in exchange for its open-source publication. Please use one of the following references in documentation that you provide on your work. For general usage citations, the preferred option is:\n\n> Wendelin, T. (2003). ""SolTRACE: A New Optical Modeling Tool for Concentrating Solar Optics."" Proceedings of the ISEC 2003: International Solar Energy Conference, 15-18 March 2003, Kohala Coast, Hawaii. New York: American Society of Mechanical Engineers, pp. 253-260; NREL Report No. CP-550-32866.\n\nFor citations in work that involves substantial development or extension of the existing code, the preferred option is:\n\n> Wendelin, T., Wagner, M.J. (2018). ""SolTrace Open-Source Software Project: [github.com/NREL/SolTrace](https://github.com/NREL/SolTrace)"". National Renewable Energy Laboratory. Golden, Colorado.'",,"2017/06/29, 15:20:21",2309,CUSTOM,10,78,"2022/11/14, 21:08:08",22,18,25,3,344,4,0.2,0.17741935483870963,,,0,4,false,,false,true,,,https://github.com/NREL,http://www.nrel.gov,"Golden, CO",,,https://avatars.githubusercontent.com/u/1906800?v=4,,,
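The core question a Monte Carlo ray tracer like SolTrace answers — what fraction of reflected rays, each perturbed by a Gaussian optical error, still lands on the receiver — can be sketched in a few lines. This is a toy 1-D model with invented numbers, not SolTrace's algorithm:

```python
import math
import random

def intercept_factor(n_rays=100_000, sigma_mrad=4.0, focal_m=10.0,
                     half_width_m=0.1, seed=1):
    """Toy 1-D Monte Carlo ray trace: draw a Gaussian angular error for
    each reflected ray and count the rays that land within the receiver
    half-width at the focal distance."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_rays):
        err_rad = rng.gauss(0.0, sigma_mrad * 1e-3)  # slope/pointing error
        if abs(focal_m * math.tan(err_rad)) <= half_width_m:
            hits += 1
    return hits / n_rays

print(f"intercept factor: {intercept_factor():.3f}")
```

SolTrace does the same counting in full 3-D over composite optical stages, surface shapes, and sun-shape distributions.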
CarrierCapture.jl,A set of codes to compute carrier capture and recombination rates in semiconducting compounds like solar cells.,WMD-group,https://github.com/WMD-group/CarrierCapture.jl.git,github,"defects,semiconductors,electronic-structure,materials-design,solar-cells",Photovoltaics and Solar Energy,"2023/03/03, 15:04:47",41,0,6,true,Jupyter Notebook,Materials Design Group,WMD-group,"Jupyter Notebook,Julia,Python,TeX",https://wmd-group.github.io/CarrierCapture.jl/dev/,"b'[](https://opensource.org/licenses/MIT)\n[](https://julialang.org)\n[](https://wmd-group.github.io/CarrierCapture.jl/dev/)\n[](https://github.com/WMD-group/CarrierCapture.jl/actions/workflows/ci.yml)\n[](https://zenodo.org/badge/latestdoi/130691083)\n[](https://doi.org/10.21105/joss.02102)\n[](https://julialang.org)\n\n\n
\n\n\nA set of codes to compute carrier capture and recombination rates in semiconducting compounds.\nThis topic has a rich history starting from the work by [Huang and Rhys](http://rspa.royalsocietypublishing.org/content/204/1078/406.short).\nOur implementation was inspired by the approach (and FORTRAN code) employed by [Alkauskas and coworkers](https://journals.aps.org/prb/abstract/10.1103/PhysRevB.90.075202), but has been adapted\nto also describe anharmonic potential energy surfaces.\n\n## Installation\n\nThe codes are written in [Julia](https://julialang.org), while the scripts and [Jupyter Notebooks](http://jupyter.org) also contain [Python](https://www.python.org) and use [pymatgen](http://pymatgen.org) and [pawpyseed](https://github.com/kylebystrom/pawpyseed) (tested on Scientific Linux 7 and Linux Mint 18), which are assumed to be installed.\nThe [Brooglie](https://github.com/RedPointyJackson/Brooglie) package is used to solve the time-independent Schr\xc3\xb6dinger equation.\n\nInstall the package by:\n\n```julia\njulia> using Pkg\n\njulia> Pkg.add(PackageSpec(url=""https://github.com/WMD-group/CarrierCapture.jl.git""))\n```\n\nTo run the unit tests for the package, use the `Pkg.test` function. \n\n```julia\njulia> Pkg.test(""CarrierCapture"")\n```\n\n## Development\n\nThe project is hosted on [Github](https://github.com/WMD-group/carriercapture).\nPlease use the [issue tracker](https://github.com/WMD-group/carriercapture/issues/) for feature requests, bug reports and more general questions.\nIf you would like to contribute, please do so via a pull request.\n\n## Usage\n\nA typical workflow will consist of several steps, implemented in a series of short programs, which may be run from the command line. Input for the calculations is provided in `input.yaml`.\n\n 0. Prepare a sequence of atomic structure models with displacements that interpolate between two defect configurations (e.g. 
a site vacancy in charge states q=0 and q=+1).\n Run single-point energy calculations on these structures, and extract the total energies. Scripts for preprocessing may be found in `script`.\n\n 1. Find a best fit for the energy calculations of the deformed structures (`potential`) to generate potential energy surfaces (PES).\n Solve the 1D Schr\xc3\xb6dinger equation for each PES to obtain their phonon (nuclear) wavefunctions.\n\n 2. Construct the configuration coordinate (`conf_coord`) to calculate the wavefunction overlap between each PES, \n which forms part of the temperature-dependent capture coefficient.\n\n\n\nThe command-line interface (`GetPotential.jl` and `GetRate.jl`) is deprecated.\nUse the [Jupyter Notebook](http://jupyter.org) [examples](https://github.com/WMD-group/CarrierCapture.jl/blob/master/example/notebook/) as a template.\n\n## Examples\n\nThe following examples are provided to illustrate some of the applications of these codes. The input data has been generated from density functional theory (DFT) using [VASP](https://www.vasp.at), but the framework can easily be adapted to accept output from other electronic structure calculators. \n\n* [SnZn in Cu2ZnSnS4](./example/notebook/Harmonic%20(Sn_Zn).ipynb): Harmonic approximation\n\n* [DX-center in GaAs](./example/notebook/Anharmonic%20(DX%20center).ipynb): Anharmonic fitting\n\n* [Electron-phonon coupling](./example/notebook/e-ph.ipynb): Electron-phonon coupling matrix element\n\n## Theory\n\n> The electronic matrix element frequently causes feelings of discomfort (Stoneham, 1981)\n\nThe capture of electrons or holes by point defects in crystalline materials requires the consideration of a number of factors, including the coupling between electronic and vibrational degrees of freedom. Many theories and approximations have been developed to describe the reaction kinetics.\n\nThe capture coefficient between an initial and final state for this computational setup is given by (eq. 
22 in [Alkauskas and coworkers](https://journals.aps.org/prb/abstract/10.1103/PhysRevB.90.075202)):\n\n\n
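The equation image itself does not survive this text extraction. From the surrounding variable descriptions, eq. 22 of Alkauskas and coworkers has the form below (reconstructed from memory, so treat it as a sketch and check the paper for exact prefactors; here *g* is a degeneracy factor and *w_m* the thermal occupation of the initial phonon states):

```latex
C = V \frac{2\pi}{\hbar}\, g\, W_{if}^{2}
    \sum_{m} w_{m} \sum_{n}
    \left| \langle \xi_{im} \vert \hat{Q} - Q_{0} \vert \xi_{fn} \rangle \right|^{2}
    \delta\!\left(\Delta E + m\hbar\Omega_{i} - n\hbar\Omega_{f}\right)
```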
\n\n\nHere, *V* is the volume of the supercell, *Wif* is the electron-phonon overlap and *\xce\xbeim* and *\xce\xbefn* describe the wavefunctions of the *mth* and *nth* phonons in the initial *i* and final *f* states. The final delta-function term serves to conserve energy and in practice is replaced by a smearing Gaussian of finite width *\xcf\x83*.\n\n### User Warning\n\nThe values produced by this type of analysis procedure are sensitive to the quality of the input. \nWe expect that most input data will have been generated by DFT where the basis set, k-points, and ionic forces have been carefully converged.\nIn addition, the alignment of energy surfaces for defects in different charge states requires appropriate finite-size corrections (e.g. see [Freysoldt and coworkers](https://journals.aps.org/rmp/abstract/10.1103/RevModPhys.86.253)).\n\n### Extended Reading List\n\n#### Theory Development\n\n* [Heny and Lang, Nonradiative capture and recombination by multiphonon emission in GaAs and GaP (1977)](https://journals.aps.org/prb/pdf/10.1103/PhysRevB.15.989)\n*Seminal contribution that introduces many important concepts*\n\n* [Huang, Adiabatic approximation theory and static coupling theory of nonradiative transition (1981)](https://www.worldscientific.com/doi/epdf/10.1142/9789812793720_0009)\n*Context for the static approximation that we employ*\n\n* [Stoneham, Non-radiative transitions in semiconductors (1981)](http://iopscience.iop.org/article/10.1088/0034-4885/44/12/001/meta)\n*Review on theory and various models of recombination*\n\n* [Markvart, Determination of potential surfaces from multiphonon transition rates (1981)](http://iopscience.iop.org/article/10.1088/0022-3719/14/15/002)\n*Discussion and treatment of anharmonicity*\n\n* [Markvart, Semiclassical theory of non-radiative transitions (1981)](http://iopscience.iop.org/article/10.1088/0022-3719/14/29/006/meta)\n*Semiclassical treatment of matrix elements following Landau and Holstein*\n\n#### 
Applications of CarrierCapture\n\n* [Kavanagh et al, Impact of metastable defect structures on carrier recombination in solar cells (2022)](https://pubs.rsc.org/en/content/articlelanding/2022/fd/d2fd00043a)\n\n* [Kavanagh et al, Rapid recombination by cadmium vacancies in CdTe (2021)](https://doi.org/10.1021/acsenergylett.1c00380)\n\n* [Whalley et al, Giant Huang\xe2\x80\x93Rhys factor for electron capture by the iodine intersitial in perovskite solar cells (2021)](https://pubs.acs.org/doi/full/10.1021/jacs.1c03064)\n\n* [Kim and Walsh, Ab initio calculation of the detailed balance limit to the photovoltaic efficiency of single p-n junction kesterite solar cells (2021)](https://aip.scitation.org/doi/10.1063/5.0049143) \n\n* [Dahliah et al, High-throughput computational search for high carrier lifetime, defect-tolerant solar absorbers (2021)](https://pubs.rsc.org/en/content/articlelanding/2021/EE/D1EE00801C)\n\n* [Kim et al, Upper limit to the photovoltaic efficiency of imperfect crystals (2020)](https://dx.doi.org/10.1039/D0EE00291G) \n\n* [Kim et al, Anharmonic lattice relaxation during non-radiative carrier capture (2019)](https://journals.aps.org/prb/abstract/10.1103/PhysRevB.100.041202) \n\n* [Kim et al, Lone-pair effect on carrier capture in Cu2ZnSnS4 solar cells (2019)](https://pubs.rsc.org/en/content/articlehtml/2019/ta/c8ta10130b)\n\n* [Kim et al, Identification of killer defects in kesterite thin-film solar cells (2018)](https://pubs.acs.org/doi/abs/10.1021/acsenergylett.7b01313)\n'",",https://zenodo.org/badge/latestdoi/130691083,https://doi.org/10.21105/joss.02102,https://doi.org/10.1021/acsenergylett.1c00380","2018/04/23, 12:10:13",2011,MIT,18,298,"2021/05/06, 23:51:49",0,9,18,0,901,0,0.1111111111111111,0.6124031007751938,"2023/04/14, 13:09:00",v0.6,0,10,false,,false,false,,,https://github.com/WMD-group,https://wmd-group.github.io,London,,,https://avatars.githubusercontent.com/u/1716969?v=4,,,
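Step 1 of the workflow above — solving the 1D Schrödinger equation on a potential energy surface to obtain phonon wavefunctions — reduces to a small matrix eigenvalue problem. A finite-difference sketch in Python/NumPy for a harmonic PES in dimensionless units (CarrierCapture itself delegates this step to the Brooglie Julia package):

```python
import numpy as np

# Solve -(1/2) psi'' + V(q) psi = E psi on a grid, for a harmonic PES
# in dimensionless units (hbar = m = omega = 1).
n = 500
q = np.linspace(-8.0, 8.0, n)
dq = q[1] - q[0]
V = 0.5 * q**2                          # harmonic potential energy surface

# Tridiagonal Hamiltonian from the 3-point Laplacian stencil
diag = 1.0 / dq**2 + V
off = -0.5 / dq**2 * np.ones(n - 1)
H = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)

E, psi = np.linalg.eigh(H)              # energies and phonon wavefunctions
# Lowest eigenvalues approach v + 1/2 = 0.5, 1.5, 2.5, ...
```

The columns of `psi` are the discretized wavefunctions whose overlaps enter the capture coefficient; anharmonic PES fits simply replace `V`.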
honeybee,"A Python library to create, run and visualize the results of daylight (RADIANCE) and energy analysis (EnergyPlus/OpenStudio).",ladybug-tools,https://github.com/ladybug-tools/honeybee.git,github,,Photovoltaics and Solar Energy,"2021/07/21, 11:22:06",90,0,4,false,Python,Ladybug Tools,ladybug-tools,"Python,Shell,Dockerfile",http://ladybug-tools.github.io/honeybee/docs,"b""\n\n[](https://travis-ci.org/ladybug-tools/honeybee)\n[](https://github.com/semantic-release/semantic-release)\n[](https://coveralls.io/github/ladybug-tools/honeybee)\n\n[](https://www.python.org/downloads/release/python-270/) [](https://github.com/IronLanguages/ironpython2/releases/tag/ipy-2.7.8/)\n\n# honeybee\n\nHoneybee is a Python library to create, run and visualize the results of daylight ([RADIANCE](https://radiance-online.org//)) and energy analysis ([EnergyPlus](https://energyplus.net/)/[OpenStudio](https://www.openstudio.net/)). The current version supports only Radiance integration. For energy simulation you may use the [legacy honeybee for Grasshopper](https://github.com/mostaphaRoudsari/honeybee).\n\nThis repository includes the core library which is the base for Honeybee plugins. 
For plugin-specific questions and comments refer to [honeybee-grasshopper](https://github.com/ladybug-tools/honeybee-grasshopper) or [honeybee-dynamo](https://github.com/ladybug-tools/honeybee-dynamo) repositories.\n\nCheck [this repository](https://github.com/mostaphaRoudsari/honeybee) for the legacy honeybee plugin for Grasshopper.\n\n## Installation\n\n```\npip install lbt-honeybee==0.1.16\n```\n\n## Tentative road map\n- [x] Basic Radiance Integration.\n- [x] Support annual daylight simulation - daylight coefficient method [Nov 2016].\n- [x] Support three-phase daylight simulation [Dec 2016].\n- [x] Support five-phase daylight simulation [Aug 2017].\n- [x] Fix PEP 8 issues [Dec 2017].\n- [x] Code documentation [Dec 2017].\n- [ ] Provide cloud service support for daylight simulation [Under progress]\n- [x] Basic EnergyPlus integration [Nov 2019]\n- [ ] Support basic HVAC modeling.\n- [ ] Full OpenStudio integration.\n\n\n## [API Documentation](http://ladybug-tools.github.io/apidoc/honeybee)\n\n## Citing honeybee\n\nFor the daylighting library cite this presentation:\n\n*Sadeghipour Roudsari, Mostapha. Subramaniam, Sarith. 2016. Automating Radiance workflows with Python. The 15th Annual Radiance Workshop. Padua, Italy. Available at: https://www.radiance-online.org/community/workshops/2016-padua/presentations/213-SadeghipourSubramaniam-AutomatingWorkflows.pdf*\n`\n\n## Examples\nHere is a Python example that shows how to put a grid-based analysis together. 
For more examples check one of the plugins repository.\n\n```python\nfrom honeybee_plus.room import Room\nfrom honeybee_plus.radiance.material.glass import Glass\nfrom honeybee_plus.radiance.sky.certainIlluminance import CertainIlluminanceLevel\nfrom honeybee_plus.radiance.recipe.pointintime.gridbased import GridBased\n\n# create a test room\nroom = Room(origin=(0, 0, 3.2), width=4.2, depth=6, height=3.2,\n rotation_angle=45)\n\n# add fenestration\n# # add a window to the back wall\nroom.add_fenestration_surface(wall_name='back', width=2, height=2, sill_height=0.7)\n\n# add another window with custom material. This time to the right wall\nglass_60 = Glass.by_single_trans_value('tvis_0.6', 0.6)\nroom.add_fenestration_surface('right', 4, 1.5, 1.2, radiance_material=glass_60)\n\n# run a grid-based analysis for this room\n# generate the sky\nsky = CertainIlluminanceLevel(illuminance_value=2000)\n\n# generate grid of test points\nanalysis_grid = room.generate_test_points(grid_size=0.5, height=0.75)\n\n# put the recipe together\nrp = GridBased(sky=sky, analysis_grids=(analysis_grid,), simulation_type=0,\n hb_objects=(room,))\n\n# write simulation to folder\nbatch_file = rp.write(target_folder='.', project_name='room')\n\n# run the simulation\nrp.run(batch_file, debug=False)\n\n# results - in this case it will be an analysis grid\nresult = rp.results()[0]\n\n# print the values for each point\nfor value in result.combined_value_by_id():\n print('illuminance value: %d lux' % value[0])\n```\n""",,"2015/12/24, 23:43:38",2861,GPL-3.0,0,797,"2022/02/02, 08:08:41",38,282,453,0,630,2,0.0,0.4323899371069182,"2020/08/13, 22:16:10",v1.0.1,0,8,false,,true,true,,,https://github.com/ladybug-tools,ladybug.tools,Worldwide,,,https://avatars.githubusercontent.com/u/14942270?v=4,,,
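Once a grid-based recipe like the one above has run, its per-point results are typically reduced to simple pass/fail metrics. A honeybee-independent sketch with made-up illuminance values (kept in the `%`-formatting style of the README's example, which targets Python 2.7/IronPython):

```python
# Hypothetical per-point illuminance results (lux), e.g. as collected
# from result.combined_value_by_id() in the example above.
illuminance_lux = [120.0, 340.0, 510.0, 280.0, 760.0, 95.0, 430.0, 305.0]

target_lux = 300.0
passing = sum(1 for v in illuminance_lux if v >= target_lux)
fraction = passing / len(illuminance_lux)
print('%d of %d points reach %.0f lux (%.1f%%)'
      % (passing, len(illuminance_lux), target_lux, 100 * fraction))
```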
Open Solar Project,ESP32 Smart Solar Charger.,opensolarproject,https://github.com/opensolarproject/OSPController.git,github,,Photovoltaics and Solar Energy,"2023/10/17, 04:24:49",217,0,37,true,C++,Open Solar Project,opensolarproject,"C++,Python,C",https://github.com/opensolarproject/OSPController/wiki,"b'
\n\n# OSP Controller \xe2\x98\x80\xef\xb8\x8f\xf0\x9f\x95\xb9 _now on [discord](https://discord.gg/GtR3JShfGu)_\n\n_DC -> DC -> DC_ Solar. With a single used solar panel, a few used batteries, and $40 in parts you can power your life, transportation and all. Add an ESP32 Arduino to a 95% efficient DC-DC buck converter controlled over serial and you get an internet-connected, privately hosted smart solar MPPT power system. [Parts list](https://github.com/opensolarproject/OSPController/wiki/Step-1-Parts-List). [Instructions](https://github.com/opensolarproject/OSPController/wiki). [About](https://github.com/opensolarproject/OSPController/wiki/About). Go build one! (And reach out! I\'m happy to help)\n
\n\n[](https://github.com/opensolarproject/OSPController/releases/latest)\n[](https://github.com/opensolarproject/OSPController/commits)\n[](https://github.com/opensolarproject/OSPController/commits)\n[](https://github.com/opensolarproject/OSPController/releases/latest)\n[](https://travis-ci.com/github/opensolarproject/OSPController)\n\n[](https://github.com/opensolarproject/OSPController/commits)\n[](https://github.com/arendst/Tasmota/stargazers)\n[](https://github.com/opensolarproject/OSPController/network)\n[](https://github.com/opensolarproject/OSPController/issues)\n[](https://github.com/opensolarproject/OSPController/issues)\n[](https://discord.gg/GtR3JShfGu)\n\n|  |\n:-------------------------:|\n| A dashboard view Grafana (optional). More details & options [here](https://github.com/opensolarproject/OSPController/wiki/Step-4-Data-Visualization) |\n\n### This solar controller:\n- Costs less than $35 in [total parts](https://github.com/opensolarproject/OSPController/wiki/Step-1-Parts-List)\n- Works with 12 - 82VDC Solar Panels, _(enabling big and efficient strings of panels!)_\n- Works with 4.2 - 60VDC batteries. 
Directly charge your high-voltage eBike batteries!\n- Is open source, modify it as you wish!\n- Connects to your MQTT smart home\n- Lets you own your own data\n- Gives you [graphs and charts](https://github.com/opensolarproject/OSPController/wiki/Step-4:-Data-Visualization) about your system from anywhere\n\n### But really, head over [to the wiki](https://github.com/opensolarproject/OSPController/wiki) for \n\n- [Background & About](https://github.com/opensolarproject/OSPController/wiki/About)\n- [Part 1:Parts](https://github.com/opensolarproject/OSPController/wiki/Step-1-Parts-List)\n- [Part 2:Hardware](https://github.com/opensolarproject/OSPController/wiki/Step-2-Hardware-Build)\n- [Part 3:Software](https://github.com/opensolarproject/OSPController/wiki/Step-3-Software-Setup)\n- [Part 4:Data](https://github.com/opensolarproject/OSPController/wiki/Step-4-Data-Visualization)\n- [Part 5:Wiring](https://github.com/opensolarproject/OSPController/wiki/Step-5-Wiring-Things)\n\n## Also join the [Discord Channel](https://discord.gg/GtR3JShfGu)\nIt\'s the discussion board to talk shop, get ideas, get help, triage issues, and share success! [discord.gg/MRQvKR](https://discord.gg/GtR3JShfGu)\n\n'",,"2019/08/09, 20:33:23",1537,GPL-3.0,3,92,"2023/10/17, 04:16:21",13,3,17,2,8,1,0.0,0.02197802197802201,"2020/06/26, 19:38:24",v2.0,0,3,false,,false,false,,,https://github.com/opensolarproject,,,,,https://avatars.githubusercontent.com/u/53953954?v=4,,,
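The MPPT at the heart of such a controller is conventionally a perturb-and-observe loop: nudge the buck converter's operating voltage, observe whether panel power rose or fell, and keep moving in the profitable direction. A textbook sketch (not the OSPController firmware):

```python
def perturb_and_observe(v, p, v_prev, p_prev, step=0.1):
    """One perturb-and-observe MPPT step: return the next operating
    voltage given the current and previous (voltage, power) samples."""
    dv = v - v_prev
    dp = p - p_prev
    if dv == 0.0:
        return v + step       # no usable history yet: perturb upward
    if dp / dv > 0.0:
        return v + step       # power rising with voltage: keep climbing
    return v - step           # past the peak: back off
```

Driven against a concave panel power curve, the loop climbs to the maximum power point and then oscillates within about one step of it.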
MPPT-Solar-Charger,Supporting documentation and software for the MPPT Solar Charger.,danjulio,https://github.com/danjulio/MPPT-Solar-Charger.git,github,,Photovoltaics and Solar Energy,"2023/05/09, 03:26:46",123,0,39,true,C,,,"C,Makefile,C++,Assembly",,"b'## makerPower\xe2\x84\xa2 MPPT Solar Charger\n\n\n\n### LiFePO4 support (added 2/2023)\nFirmware version 2.0 adds support for 12V (4-cell) LiFePO4 batteries. Several charging parameters, shown below, are changed when a LiFePO4 battery type is selected.\n\n1. Float/Bulk initial charge state threshold set to 13.2V\n2. Bulk Voltage set to 14.4V\n3. Power On charge voltage set to 13.6V\n4. Temperature Compensation disabled\n5. Charge temperature range between 0\xc2\xb0C and 50\xc2\xb0C\n\nThe charger will default to the lead acid battery type. Add a jumper between the test pads shown below to configure the charger for a LiFePO4 battery.\n\n\n\nInstructions are provided in the Firmware directory for upgrading existing boards.\n\n### Contents\nThis repository contains documentation and software for the makerPower MPPT Solar Charger board (design documented at [hackaday.io](https://hackaday.io/project/161351-solar-mppt-charger-for-247-iot-devices)). It can be found in my [tindie store](https://www.tindie.com/products/globoy/mppt-solar-charger-for-intelligent-devices/).\n\n1. firmware - Charger C source code\n2. hardware - Board documentation, schematic and connection diagrams for different uses\n3. arduino - Arduino library and examples (can be compiled with wiringPi for Raspberry Pi too)\n4. mppt_dashboard - Mac OS, Windows and Linux monitoring application that communicates with the charger via the mpptChgD daemon\n5. mpptChgD - Linux Daemon compiled for Raspberry Pi that communicates with the charger via I2C\n\nThe makerPower is a combination solar battery charger and 5V power supply for IOT-class devices designed for 24/7 operation off of solar power. 
It manages charging a 12V AGM lead acid or LiFePO4 battery from common 36-cell 12V solar panels. It provides 5V power output at up to 2A for systems that include sensors or communication radios. Optimal charging is provided through a dynamic perturb-and-observe maximum power-point transfer converter (MPPT) and a 3-stage (BULK, ABSORPTION, FLOAT) charging algorithm. A removable temperature sensor provides temperature compensation. Operation is plug&play although additional information and configuration may be obtained through a digital interface.\n\n* Optimized for commonly available batteries in the 7-18 Ah range and solar panels in the 10-35 Watt range\n* Reverse Polarity protected solar panel input with press-to-open terminal block\n* Fused battery input with press-to-open terminal block\n* Maximum 2A at 5V output on USB Type A power output jack and solder header\n* Automatic low-battery disconnect and auto-restart on recharged battery\n* Temperature compensation sensor with internal sensor fallback (lead acid batteries)\n* Disable charging when battery is too cold or too hot\n* Status LED indicating charge and power conditions, fault information\n* I2C interface for detailed operation condition readout and configuration parameter access\n* Configurable battery charge parameters\n* Status signals for Night detection and pre-power-down alert\n* Night-only operating mode (switch 5V output on only at night)\n* Watchdog functionality to power-cycle connected device if it crashes or for timed power-off control\n\n### Applications\n* Remote control and sense applications\r* Solar powered web or timelapse camera\r* Night-time \xe2\x80\x9ccritter cam""\r* Solar powered LED night lighting controller\n\n#### Bonus Application\nThe charger works well as a 12- and/or 5-V UPS when combined with a laptop power supply. 
The laptop supply should be able to supply at least 3.5A at between 18.5 - 21V output (for example a Dell supply at 20V/3.5A) - a high enough voltage to initiate charging. The charger will both charge the battery and supply the load current to the user\'s device and the battery will supply power if AC power fails.\n\n### Compatible Solar Panels and Batteries\nThe makerPower is designed to use standard 25- or 35-Watt 12V solar panels with 7-Ah to 18-Ah 12V AGM type lead acid or LiFePO4 batteries. It has a maximum charge capacity of about 35-38 watts. A detailed sizing method is described in the user manual but it is possible to use smaller or larger panels and batteries depending on the application.\n\nTypically a 25-Watt panel is paired with a 7-Ah battery for small systems (Arduino-type up to Raspberry Pi Zero type). A 35-Watt panel is paired with 9-Ah to 18-Ah batteries for larger systems. Larger batteries provide longer run-time during poor (lower light) charging conditions. A larger panel can provide more charge current during poor charging conditions.\n\nSolar panels should be a 36-cell type with a typical maximum power-point of around 18V and maximum open-circuit voltage of 23V (typically around 21-22V). 
Available panels and batteries I have tested with are shown below.\n\n* [25 Watt Panel](https://www.amazon.com/gp/product/B014UND3LA)\n* [35 Watt Panel](https://www.amazon.com/gp/product/B01G1II6LY)\n* [9 Ah Lead Acid Battery](https://www.amazon.com/Power-Sonic-PS-1290-Rechargeable-Battery-Terminals/dp/B002L6R130)\n* [10 Ah LiFePO4 Battery](https://www.amazon.com/ExpertPower-Lithium-Rechargeable-2500-7000-lifetime/dp/B07X3Y3LS5)\n* [18 Ah Lead Acid Battery](https://www.amazon.com/ExpertPower-EXP12180-Rechargeable-Battery-Bolts/dp/B00A82A3RK)\n\n### Enclosures\n\nI have used the [Carlon E989N](https://www.homedepot.com/p/Carlon-8-in-x-4-in-PVC-Junction-Box-E989N-CAR/100404099) enclosure found at a Home Depot home improvement store to hold the battery, charger and single-board computer. It is a good size providing room for a 7-Ah to 10-Ah battery as well as room for heat dissipation from both the charger and computer. Note that the charger can dissipate upwards of 5W when running at full capacity.\n\nOther possible enclosures include the following.\n\n* Hammond Manufacturing [RP1465/RP1465C](https://www.hammfg.com/electronics/small-case/plastic/rp)\n* Bud Industries [PIP-11774/PIP-11774-C](https://www.budind.com/view/NEMA+Boxes/NEMA+4X+-+PIP)\n\n### Questions?\n\nContact the designer - dan@danjuliodesigns.com\n'",,"2018/04/30, 04:48:29",2004,GPL-3.0,4,53,"2021/01/10, 22:43:46",2,0,2,0,1017,0,0,0.0,,,0,1,false,,false,false,,,,,,,,,,,
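Temperature compensation for lead-acid charging (the role of the removable sensor above) is conventionally a linear derating of the charge voltage around 25 °C. The sketch below uses typical textbook numbers (about -4 mV/°C per cell, six cells), not the makerPower firmware's actual parameters:

```python
def compensated_charge_voltage(temp_c, v_nominal=14.4, t_ref_c=25.0,
                               coeff_v_per_c=-0.024, v_min=13.8, v_max=15.0):
    """Linearly temperature-compensated charge voltage for a 12 V
    lead-acid battery, clamped to a safe window. Illustrative values:
    -4 mV/degC per cell * 6 cells = -24 mV/degC."""
    v = v_nominal + coeff_v_per_c * (temp_c - t_ref_c)
    return max(v_min, min(v_max, v))
```

At 25 °C this returns the nominal 14.4 V; colder batteries get a higher voltage (clamped at 15.0 V) and warmer ones a lower voltage — and, as noted above, the firmware disables this compensation entirely for LiFePO4.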
Tonatiuh,A Monte Carlo ray tracer for the optical simulation of solar concentrating systems.,iat-cener,https://github.com/iat-cener/tonatiuh.git,github,"solar-concentrating-systems,simulation,solar,energy",Photovoltaics and Solar Energy,"2023/03/27, 11:34:44",51,0,6,true,C++,National Renewable Energy Centre - CENER,iat-cener,"C++,QMake,C,Batchfile",http://iat-cener.github.io/tonatiuh/,"b'\r\n# News #\r\n\r\n## Tonatiuh release 2.2.4 is now available! ##\r\n\r\nThis release fixes several bugs. The most relevant is a bug in the ShapeCAD intersection algorithm that was only visible on some operating systems. In addition, the Pillbox distribution function has been removed from the material error options to avoid the simulation errors caused by using this distribution function.\r\n \r\nFind the latest release here: https://github.com/iat-cener/tonatiuh/releases/tag/v2.2.4\r\n\r\n# Overview #\r\n_The Tonatiuh project aims to create an open source, cutting-edge, accurate, and easy to use Monte Carlo ray tracer for the optical simulation of solar concentrating systems. 
It intends to advance the state-of-the-art of the simulation tools available for the design and analysis of solar concentrating systems, and to make those tools freely available to anyone interested in using and improving them._\r\nSome of the most relevant design goals of Tonatiuh are:\r\n * To develop a robust theoretical foundation that will facilitate the optical simulation of almost any type of solar concentrating system.\r\n * To exhibit a clean and flexible software architecture that will allow the user to adapt, expand, increase, and modify its functionalities with ease.\r\n * To achieve operating system independence at source level, and run on all major platforms with no, or minor, modifications to its source code.\r\n * To provide the users with an advanced and easy-to-use Graphic User Interface (GUI).\r\nAdditional information on the rationale for this open source project, and on the goals, general characteristics, and current status of Tonatiuh is given in the two following videos. The first video is based on a [Pecha Kucha presentation](http://en.wikipedia.org/wiki/Pecha_Kucha) given by Dr. Manuel J. Blanco at the University of Seville in September 2008. Although the audio track is in Spanish, the video is closed-captioned in Spanish and subtitled in English. Information on how to use the closed-caption/subtitle features of YouTube videos can be found [here](http://help.youtube.com/support/youtube/bin/answer.py?answer=100078). The second video is based on an oral presentation of the paper entitled ""Preliminary validation of Tonatiuh"" given by Dr. Manuel J. Blanco on September 17th, at the 2009 International Energy Agency\'s SolarPACES Symposium, held in Berlin, Germany.\r\n\r\n
## Features ##\r\nThe use of extended Open Inventor files to represent the ""scene"" (i.e. the solar concentrating system, the sunlight model, etc.)\r\nAn advanced and easy-to-use GUI providing:\r\n * 3D and tree views of the ""scene"" to simulate.\r\n * Handlers and manipulators to modify and query scene objects using 3D views.\r\n * Interface elements to manage the undo and redo of user actions.\r\n * Interface elements to define the type of Monte Carlo ray tracing to execute.\r\nA pervasive plugin architecture which allows the user to:\r\n * Add new sunlight models.\r\n * Add new geometric surfaces.\r\n * Add new reflective materials.\r\n * (planned) Add new refractive materials.\r\n * (planned) Add new photon map and other results analyzers, and post-processors.\r\n * (planned) Add new spectrum models.\r\n## Requirements ##\r\nLike any other ambitious open source program, Tonatiuh leverages several existing open source libraries and tools. The principal open source resources used by Tonatiuh are:\r\n * Digia Qt for the Graphic User Interface (GUI).\r\n * Coin3D Toolkit for 3D Graphics Visualization.\r\n * Marble generic geographical map widget and framework.\r\n * CPPUnit for testing the code.\r\nAll these tools are used for developing Tonatiuh within the Eclipse IDE in a standard development environment used by the entire development team.\r\n\r\n## Tonatiuh\'s output files format ##\r\n\r\nFrom version 2.0.1, the format of simulation outputs has been changed to be more flexible for post-processing. 
A description of the format is available at [Tonatiuh\'s output files format](https://github.com/iat-cener/tonatiuh/wiki/Output-files-format).\r\n\r\n## Citing Tonatiuh ##\r\n\r\nThese are some of the most relevant references:\r\n * Les, I., Mutuberria, A., Sch\xc3\xb6ttl, P., Nitz, P. (2017). New Functionalities for the Tonatiuh Ray-tracing Software. Proceedings of the 23rd SolarPACES Conference.\r\n * J. Cardoso, J., Mutuberria, A., Marakko, C., Schoettl, P., Os\xc3\xb3rio, T., Les, I. (2017). New Functionalities for the Tonatiuh Ray-tracing Software. Proceedings of the 23rd SolarPACES Conference.\r\n * Blanco, M., Mutuberria, A., Monreal, A., & Albert, R. (2011). Results of the empirical validation of Tonatiuh at Mini-Pegase CNRS-PROMES facility. Proc SolarPACES.\r\n * Blanco, M. J., Mutuberria, A., & Martinez, D. (2010). Experimental validation of Tonatiuh using the Plataforma Solar de Almer\xc3\xada secondary concentrator test campaign data. In 16th annual SolarPACES symposium.\r\n * Blanco, M. J., Mutuberria, A., Garcia, P., Gastesi, R., & Martin, V. (2009). Preliminary validation of Tonatiuh. SolarPACES Symposium, Berlin, Germany.\r\n * Blanco, M. J., Amieva, J. M., & Mancillas, A. (2005, January). The Tonatiuh Software Development Project: An open source approach to the simulation of solar concentrating systems. In ASME 2005 International Mechanical Engineering Congress and Exposition (pp. 157-164). American Society of Mechanical Engineers.\r\n \r\n \r\n
This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License.\r\n'",,"2015/05/08, 10:51:30",3092,GPL-3.0,3,1171,"2023/04/28, 14:30:58",50,1,74,5,180,0,0.0,0.07553551296505068,"2017/09/15, 07:38:25",v2.2.4,0,5,false,,false,false,,,https://github.com/iat-cener,https://www.cener.com,"Sarriguren, Spain",,,https://avatars.githubusercontent.com/u/12331888?v=4,,,
PV4GER,Aims at democratizing and accelerating the access to photovoltaic systems data in Germany and beyond.,kdmayer,https://github.com/kdmayer/3D-PV-Locator.git,github,"neurips-2020,solar,renewable-energy,pv-systems,deepsolar,computer-vision,inception-v3,deeplabv3,network-planning,solar-panels,climate-change,ai,deep-learning,remote-sensing,satellite-imagery",Photovoltaics and Solar Energy,"2023/07/06, 19:11:46",42,0,7,true,Python,,,"Python,Dockerfile",,"b'# 3D-PV-Locator\n\n\n\nRepo with [documentation](docs/_build/rinoh/pv4ger.pdf) for ""[3D-PV-Locator: Large-scale detection of rooftop-mounted photovoltaic systems in 3D](https://www.sciencedirect.com/science/article/pii/S0306261921016937?via%3Dihub)"" published in Applied Energy.\n\nIn case you would like to explore the code with which we created the image datasets and pre-processed the CityGML files, please have a look at the following [GitHub repo](https://github.com/kdmayer/CityGML-Preprocessing-Demo).\n\n## About\n\n3D-PV-Locator is a joint research initiative between [Stanford University](http://web.stanford.edu/group/energyatlas/home.html), [University of Freiburg](https://www.is.uni-freiburg.de/research/smart-cities-industries-group/smart-cities-industries-sci-group), and [LMU Munich](https://www.en.compecon.econ.uni-muenchen.de/staff/postdocs/arlt1/index.html) that aims at democratizing and accelerating the access to photovoltaic (PV) systems data in Germany and beyond. 
\n\nTo do so, we have developed a computer vision-based pipeline leveraging aerial imagery with a spatial resolution of\n10 cm/pixel and 3D building data to automatically create address-level and rooftop-level PV registries for all counties\nwithin Germany\'s most populous state North Rhine-Westphalia.\n\n\n\n### Address-level registry\n\nFor every address equipped with a PV system in North Rhine-Westphalia, the automatically produced address-level\nregistry in GeoJSON-format specifies the respective PV system\'s: \n\n- geometry: Real-world coordinate-referenced polygon describing the shape of the rooftop-mounted PV system\n- area_inter: The total area covered by the PV system in square meters\n- area_tilted: The total area covered by the PV system in square meters, corrected by the respective rooftop tilt\n- capacity_not_tilted_area: The total PV capacity in kWp of area_inter\n- capacity_titled_area: The total PV capacity in kWp of area_tilted \n- location of street address in latitude and longitude \n- street address\n- city and\n- ZIP code\n\n### Rooftop-level registry\n\nFor every rooftop equipped with a PV system in North Rhine-Westphalia, the automatically produced rooftop-level\nregistry in GeoJSON-format specifies the respective PV system\'s: \n\n- Azimuth: Orientation of the rooftop-mounted PV system, with 0\xc2\xb0 pointing to the North\n- Tilt: Tilt of the rooftop-mounted PV system, with 0\xc2\xb0 being flat\n- RoofTopID: Identifier of the respective rooftop\n- geometry: Real-world coordinate-referenced polygon describing the shape of the rooftop-mounted PV system\n- area_inter: The total area covered by the PV system in square meters\n- area_tilted: The total area covered by the PV system in square meters, corrected by the respective rooftop tilt\n- capacity_not_tilted_area: The total PV capacity in kWp of area_inter\n- capacity_titled_area: The total PV capacity in kWp of area_tilted\n- street address\n- city and\n- ZIP code \n\nFor a detailed 
description of the underlying pipeline and a case study for the city of Bottrop, please have a look at our spotlight talk at NeurIPS 2020:\n\n- [Paper](https://www.climatechange.ai/papers/neurips2020/46/paper.pdf)\n- [Slides](https://www.climatechange.ai/papers/neurips2020/46/slides.pdf)\n- [Recorded Talk](https://slideslive.com/38942134/an-enriched-automated-pv-registry-combining-image-recognition-and-3d-building-data)\n\nYou might also want to take a look at other projects within Stanford\'s EnergyAtlas initiative:\n\n- [EnergyAtlas](http://web.stanford.edu/group/energyatlas/home.html)\n- DeepSolar for Germany: [Publication](https://ieeexplore.ieee.org/document/9203258) and [Code](https://github.com/kdmayer/PV_Pipeline)\n\n## Datasets and pre-processing code are public\n\nPlease note that apart from the pipeline code and documentation, we also provide you with\n\n- A **pre-trained model checkpoint for PV classification** on aerial imagery with a spatial resolution of 10cm/pixel.\n- A **pre-trained model checkpoint for PV segmentation** on aerial imagery with a spatial resolution of 10cm/pixel.\n- A **100,000+ image dataset** for PV system classification.\n- A **4,000+ image dataset** for PV system segmentation.\n- **Pre-processed 3D building data** in .GeoJSON format for the entire state of North Rhine-Westphalia.\n\nIn case you would like to explore the code with which we created the image datasets and pre-processed the CityGML files, please have a look at the following [GitHub repo](https://github.com/kdmayer/CityGML-Preprocessing-Demo).\n\nWhen using these resources, please cite our work as specified at the bottom of this page.\n\n**NOTE**: All images and 3D building data is obtained from [openNRW](https://www.bezreg-koeln.nrw.de/brk_internet/geobasis/luftbildinformationen/aktuell/digitale_orthophotos/index.html). 
Labeling of the images for PV system classification and segmentation has been conducted by us.\n\n## Usage Instructions:\n\n git clone https://github.com/kdmayer/3D-PV-Locator.git\n cd 3D-PV-Locator\n\nDownload pre-trained classification and segmentation models for PV systems from our public AWS S3 bucket. This bucket is in ""requester pays"" mode, which means that you need to configure your AWS CLI before being able to download the files. Instructions on how to do it can be found [here](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html).\n\nOnce you have configured your AWS CLI with \n\n aws configure\n\nyou can list and browse our public bucket with\n\n aws s3 ls --request-payer requester s3://pv4ger/\n \nPlease download our pre-trained networks for PV system classification and segmentation by executing\n\n aws s3 cp --request-payer requester s3://pv4ger/NRW_models/inceptionv3_weights.tar models/classification/\n aws s3 cp --request-payer requester s3://pv4ger/NRW_models/deeplabv3_weights.tar models/segmentation/\n\nTo create PV registries for any county within North Rhine-Westphalia, you need to \n\n1. Download the 3D building data for your desired county from our S3 bucket by executing and replacing with a county name from the list below:\n\n aws s3 cp --request-payer requester s3://pv4ger/NRW_rooftop_data/ data/nrw_rooftop_data/\n \n Example for the county of **Essen**:\n \n aws s3 cp --request-payer requester s3://pv4ger/NRW_rooftop_data/Essen.geojson data/nrw_rooftop_data/\n\n2. Specify the name of your desired county for analysis in the config.yml next to the ""county4analysis"" element by\n choosing one of the counties from the list below:\n\n Example:\n \n county4analysis: Essen\n \n3. 
**OPTIONAL STEP**: Obtain your Bing API key for geocoding from [here](https://docs.microsoft.com/en-us/bingmaps/getting-started/bing-maps-dev-center-help/getting-a-bing-maps-key) and paste it in the config.yml file next to the ""bing_key"" element\n\n Example:\n \n bing_key: \n \n **NOTE**: If you leave empty, geocoding will be done by the free OSM geocoding service.\n\nOnce the data and models are in place, we build and run the docker container with all required dependencies in interactive mode and mount the /data and /log directories in the container to our local machine.\nMounting the /data and /log directories allows us to share the code outputs between the container and our local machine.\n\n docker build -t 3d_pv_docker .\n docker run -it -v /data/:/app/data/ -v /logs/:/app/logs/ 3d_pv_docker\n\nPlease ensure that ** corresponds to your absolute path to the 3D-PV-Locator repo on your local machine, e.g., */Users/kevin/Projects/Active/3D-PV-Locator/* in my case.\n\nNote: Depending on how many tiles you want to download, you will need to adjust the memory of your Docker container with the following flag for the docker run command:\n\n --memory=\n\nWith the docker container running in interactive mode, we can now decide which pipeline steps we want to run by putting a ""1"" next to them.\n\n Example:\n \n run_tile_creator: 1\n\n run_tile_downloader: 1\n\n run_tile_processor: 1\n\n run_tile_coords_updater: 0\n\n run_registry_creator: 1\n \nIn the interactive Docker container, we then execute the pipeline with:\n\n python run_pipeline.py\n\nAfter successful completion, the resulting PV registry for your area of interest will be written to /data/pv_registry.\n\n## List of available counties:\n \nPlease choose the county you would like to run the pipeline for from the following list:\n\n- D\xc3\xbcren\n- Essen\n- Unna\n- M\xc3\xb6nchengladbach\n- Solingen\n- Dortmund\n- G\xc3\xbctersloh\n- Olpe\n- Steinfurt\n- Bottrop\n- Coesfeld\n- Leverkusen\n- K\xc3\xb6ln\n- Soest\n- 
M\xc3\xbclheim-a.d.-Ruhr\n- M\xc3\xbcnster\n- Heinsberg\n- Oberhausen\n- Euskirchen\n- Krefeld\n- Warendorf\n- Recklinghausen\n- Bochum\n- Rhein-Kreis-Neuss\n- Rheinisch-Bergischer-Kreis\n- Herne\n- Kleve\n- Bonn\n- Minden-L\xc3\xbcbbecke\n- Herford\n- Rhein-Sieg-Kreis\n- D\xc3\xbcsseldorf\n- Hagen\n- Paderborn\n- Wuppertal\n- Oberbergischer-Kreis\n- Viersen\n- Rhein-Erft-Kreis\n- M\xc3\xa4rkischer-Kreis\n- St\xc3\xa4dteregion-Aachen\n- Remscheid\n- Mettmann\n- Lippe\n- Ennepe-Ruhr-Kreis\n- Hochsauerlandkreis\n- Gelsenkirchen\n- H\xc3\xb6xter\n- Borken\n- Hamm\n- Bielefeld\n- Duisburg\n- Siegen-Wittgenstein\n- Wesel \n\n## OpenNRW Platform:\n\nFor the German state of North Rhine-Westphalia (NRW), OpenNRW provides:\n\n- Aerial imagery at a spatial resolution of 10cm/pixel\n- Extensive 3D building data in CityGML format\n\n## License:\n\n[MIT](https://github.com/kdmayer/PV_Pipeline/blob/master/LICENSE)\n\n## BibTex Citation:\n\nPlease cite our work as\n\n @article{MAYER2022,\n title = {3D-PV-Locator: Large-scale detection of rooftop-mounted photovoltaic systems in 3D},\n journal = {Applied Energy},\n volume = {310},\n pages = {118469},\n year = {2022},\n issn = {0306-2619},\n doi = {https://doi.org/10.1016/j.apenergy.2021.118469},\n url = {https://www.sciencedirect.com/science/article/pii/S0306261921016937},\n author = {Kevin Mayer and Benjamin Rausch and Marie-Louise Arlt and Gunther Gust and Zhecheng Wang and Dirk Neumann and Ram Rajagopal},\n keywords = {Solar panels, Renewable energy, Image recognition, Deep learning, Computer vision, 3D building data, Remote sensing, Aerial imagery},\n }\n'",",https://doi.org/10.1016/j.apenergy.2021.118469","2021/01/20, 12:47:54",1008,MIT,2,109,"2023/07/06, 18:49:09",0,3,5,1,111,0,0.0,0.0,,,0,1,false,,false,false,,,,,,,,,,,
PV Free,A public API for PV modeling parameters.,BreakingBytes,https://github.com/BreakingBytes/pvfree.git,github,"api,python,solar-energy",Photovoltaics and Solar Energy,"2023/08/10, 03:59:55",18,0,3,true,Python,Breaking Bytes,BreakingBytes,"Python,HTML,Procfile",https://pvfree.azurewebsites.net/,"b'[](https://travis-ci.org/BreakingBytes/pvfree)\n\nPV Free\n=======\nA public API for PV modeling parameters and pvlib API for learning about solar.\n\nAnnouncements\n-------------\npvfree is moving to Microsoft Azure Cloud b/c Heroku free dyno service will end\nNov 28th. Please use https://pvfree.azurewebsites.net/ from now on to get module\nand inverter parameters for pvlib and to learn about solar energy modeling.\n\nUsage\n-----\nBrowsing to\n[`pvfree.azurewebsites.net/api/v1/pvinverter/`](https://pvfree.azurewebsites.net/api/v1/pvinverter/?format=json)\nwill display a JSON string with the first 20 records. The endpoint and query\nstring to obtain the next set\n[`api/pvinverter/?limit=20&offset=20`](https://pvfree.azurewebsites.net/api/v1/pvinverter/?format=json&limit=20&offset=20)\nis contained in the `next` key of the string as are the endpoints for each\ninverter. Note: the query string `?format=json` is only necessary when using the API url directly in a browser to display the response.\n\n[Tastypie](https://django-tastypie.readthedocs.org/en/latest/)\n--------------------------------------------------------------\nThe API is generated by the Tastypie django extension. 
Add the following endpoints to the base URL, [`https://pvfree.azurewebsites.net/`](https://pvfree.azurewebsites.net/):\n\n* Get first 20 pvinverters.\n\n api/pvinverter/\n\n* Get first pvinverter.\n\n api/pvinverter/1/\n\n* Get pvinverter set containing #\'s 2, 3, 5, and 10.\n\n api/pvinverter/set/2;3;5;10/\n\n* Get 100 pvinverters starting from pvinverter # 500.\n\n api/pvinverter/?limit=100&offset=500\n\n* Get pvinverter database schema.\n\n api/pvinverter/schema/\n\n[Python Requests](https://requests.readthedocs.io/en/master/)\n-------------------------------------------------------------\nPython has several libraries for interacting with URLs. The Requests package is available from [PyPI](https://pypi.python.org/pypi/requests).\n\n```python\n>>> import requests\n>>> response = requests.get(\'https://pvfree.azurewebsites.net/api/v1/pvinverter/set/1;3;5/\')\n>>> response\n \n>>> response.status_code\n 200\n>>> response.content\n {""objects"": [{""C0"": -2.48104842861e-05, ""C1"": -9.0149429405099999e-05, ""C2"": 0.00066889632690700005, ""C3"": -0.018880466688599998, ""Idcmax"": 10.0, ""MPPT_hi"": 50.0, ""MPPT_low"": 20.0, ""Paco"": 250.0, ""Pdco"": 259.52205054799998, ""Pnt"": 0.02, ""Pso"": 1.7716142241299999, ""Sandia_ID"": 1399, ""Tamb_low"": -40.0, ""Tamb_max"": 85.0, ""Vaco"": 208.0, ""Vdcmax"": 65.0, ""Vdco"": 40.242603174599999, ""id"": 1, ""manufacturer"": ""ABB"", ""name"": ""MICRO-0.25-I-OUTD-US-208"", ""numberMPPTChannels"": 1, ""resource_uri"": ""/api/v1/pvinverter/1/"", ""source"": ""CEC"", ""vintage"": ""2014-01-01"", ""weight"": 1.6499999999999999}, ...]}\n```\n'",,"2015/02/28, 08:09:30",3161,BSD-2-Clause,13,251,"2022/11/23, 09:26:16",6,24,34,4,336,0,0.0,0.03167420814479638,,,0,3,false,,true,false,,,https://github.com/BreakingBytes,http://breakingbytes.github.io/,"Oakland, CA",,,https://avatars.githubusercontent.com/u/1437576?v=4,,,
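The limit/offset paging described in the PV Free README above lends itself to a small URL helper. A minimal stdlib sketch: the endpoint and query parameters follow the README, but `page_url` is a hypothetical helper (not part of the pvfree API), and the actual HTTP request is left as a comment so the snippet runs offline.

```python
from urllib.parse import urlencode

# Base endpoint from the PV Free README; `page_url` is a hypothetical
# helper for building the paged query strings the API describes.
BASE = "https://pvfree.azurewebsites.net/api/v1/pvinverter/"

def page_url(limit: int = 20, offset: int = 0) -> str:
    """Build a paged pvinverter query URL.

    The API itself returns the next page's endpoint in the `next` key
    of each JSON response; this just constructs the equivalent URL.
    """
    return BASE + "?" + urlencode({"format": "json", "limit": limit, "offset": offset})

# Second page of 20 records, mirroring the README's example:
print(page_url(limit=20, offset=20))
# With the requests package installed, fetching would look like:
#   requests.get(page_url(20, 20)).json()["objects"]
```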
Pysolar,A collection of Python libraries for simulating the irradiation of any point on earth by the sun. It includes code for extremely precise ephemeris calculations.,pingswept,https://github.com/pingswept/pysolar.git,github,,Photovoltaics and Solar Energy,"2023/09/25, 22:37:43",348,0,37,true,Jupyter Notebook,,,"Jupyter Notebook,Python",http://pysolar.org,"b'## Pysolar ##\n\n[](https://github.com/pingswept/pysolar/actions/workflows/testpackage.yml)\n\nPysolar is a collection of Python libraries for simulating the irradiation of any point on earth by the sun. It includes code for extremely precise ephemeris calculations, and more.\n\n# Note: right now, the latest commits of Pysolar don\'t work with Python 2.x #\n\n* Release 0.6 works with 2.x: https://github.com/pingswept/pysolar/releases/tag/0.6 but 0.7 and later have a bunch of changes. They have been validated for Python 3.4, but releases 3.2 or earlier are missing features that the changes require. *\n\nAlso, the API has changed slightly:\n\n * Pysolar now expects you to supply a **timezone-aware datetime**, rather than a naive datetime in UTC. If your results seem crazy, this is probably why.\n * Function names are now `lowercase_separated_by_underscores`, in compliance with [PEP8](https://www.python.org/dev/peps/pep-0008/#function-names).\n\n## Installation ##\n\nAssuming you have Python 3.4 or higher installed, you can install Pysolar with `pip`:\n\n sudo pip install pysolar\n\nDocumentation now appears at [docs.pysolar.org](http://docs.pysolar.org).\n\n## Contributions ##\n\nAll contributions go through pull requests on Github.\n\nEditing [the documentation](http://docs.pysolar.org) is particularly easy-- just click the ""Edit on Github"" link at the top of any page.\n\nCode contributions are welcome under the terms of the GPLv3 license. 
If you\'re unfamiliar with Github, you could start with [this guide to working on open source projects](https://guides.github.com/activities/contributing-to-open-source/).\n\n## Support ##\n\nYour first move should be to read the [documentation](http://docs.pysolar.org) and think. But you\'ve probably already tried that.\n\nYour second move is to ask a question on the [pysolar-discuss](http://lists.pysolar.org) mailing list. The original author of Pysolar, [Brandon Stafford](http://rascalmicro.com), monitors the mailing list. Please understand that I wrote (most of) Pysolar around a decade ago when I worked in the solar industry. Now, I\'m an electrical engineer who just maintains Pysolar as a fun hobby. The other people on the list are other users like you-- some are experts; some are amateurs. None of them are getting paid for this.\n\nTo subscribe to the mailing list, send a blank email to pysolar-discuss+subscribe@pysolar.org. After a few minutes, you\'ll get a confirmation message; reply to the confirmation to complete the subscription.\n\nIn order to post to the list, you have to subscribe. You also have to pass a threshold of civil discourse regulated by me.\n\nThe archive of the list is publicly available here: http://lists.pysolar.org/.\n\nIf you ever want to unsubscribe, send an email to pysolar-discuss+unsubscribe@pysolar.org and reply to the confirmation message. If you run into trouble, just send me an email at brandon@pingswept.org, and I\'ll remove your address manually.\n\nPlease report bugs to [the issue tracker on Github](https://github.com/pingswept/pysolar/issues); I am automatically notified when a new issue is opened.\n\n## License ##\n\nPysolar is licensed under [the GPLv3](https://www.gnu.org/licenses/gpl-3.0.html).\n'",,"2008/03/01, 23:35:48",5715,GPL-3.0,15,364,"2023/09/25, 22:37:44",14,73,144,10,29,0,0.1,0.41509433962264153,"2023/07/25, 17:13:16",0.11,0,41,false,,false,false,,,,,,,,,,,
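The timezone-aware-datetime requirement the Pysolar README stresses above is the most common stumbling block, so here is a minimal stdlib sketch of constructing one. The pysolar call itself is only sketched in a comment (the package may not be installed); the function name follows the PEP8 renaming the README describes.

```python
from datetime import datetime, timezone

# Pysolar 0.7+ expects a timezone-aware datetime; passing a naive UTC
# datetime yields wrong solar positions ("If your results seem crazy,
# this is probably why").
naive = datetime(2023, 6, 21, 12, 0, 0)        # what NOT to pass
aware = naive.replace(tzinfo=timezone.utc)     # what Pysolar expects

assert naive.tzinfo is None
assert aware.tzinfo is not None

# With pysolar installed, solar position would then be queried roughly as:
#   from pysolar.solar import get_altitude
#   get_altitude(latitude_deg, longitude_deg, aware)
```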
PV_ICE,"An open-source tool to quantify Solar Photovoltaics (PV) Energy and Mass Flows in the Circular Economy, from a Reliability and Lifetime approach.",NREL,https://github.com/NREL/PV_ICE.git,github,"circular-economy,circularity-metrics,reliability,solar-energy,circularity,mass-flow,repair,reuse,recycle,photovoltaics,lifetime",Photovoltaics and Solar Energy,"2023/10/25, 00:35:04",27,0,8,true,Python,National Renewable Energy Laboratory,NREL,Python,https://pv-ice.readthedocs.io/en/latest/,"b'
\n\n*(Version, License, and Documentation badges)*\n
\n\n\n# PV ICE: PV in the Circular Economy, a Dynamic Energy and Materials Tool\n\nThis open-source tool explores the effects of Circular Economy (CE) pathways for photovoltaic (PV) materials. It can be used to quantify and assign a value framework to CE efforts including re-design, reduction, replacement, reuse, recycling, and lifetime and reliability improvements across the PV value chain. PV ICE enables tradeoff analysis through scenario comparisons, and is highly customizable through user inputs such as deployment schedules, module properties and component materials, and CE pathways.\n\nThe provided PV ICE module and material baselines leverage published data from many sources on PV manufacturing and predicted technological changes. Input data are being compiled [here](https://docs.google.com/spreadsheets/d/1WV54lNAdA2uP6a0g5wMOOE9bu8nbwvnQDgLj3GuGojE/edit?usp=sharing) and the baselines are available here for use in other projects as well as for the PV ICE tool.\n\n\nHow it Works\n=============\n\nThis section provides a brief description of how the PV ICE tool works. FULL DOCUMENTATION CAN BE FOUND AT [readthedocs](https://pv-ice.readthedocs.io/en/latest/?badge=latest).\n\n\nMass\n-----\n\nPV ICE is a dynamic mass flow based tool. It takes in any deployment forecast of any evolving module design along with its component materials and uses sophisticated lifetime and reliability parameters to calculate effective capacity, virgin material demand, and life cycle wastes. The calculator captures all the mass flows shown in the simplified diagram below for all years studied in a simulation (ex: 2020-2050). \n\n
\n\nAnnually deployed cohorts of modules are tracked through the simulation, subjected to lifetime, degradation, and reliability parameters, and guided along user defined CE pathways (ex: resell, recycling). The PV ICE framework is designed for scenario comparisons (ex: different deployment schedules, module designs, or circular pathways) and is capable of both geospatial and temporal analysis (i.e. when and where materials will be demanded or are available).\n\nModule and material properties are known to be variable with time, and PV ICE can capture this dynamic evolution of PV technology. Dynamic baseline inputs for crystalline silicon PV modules and component materials are provided in the PV_ICE \\ baselines folder. These baselines are derived from [literature and report data](https://docs.google.com/spreadsheets/d/1Ec5JRBSN2NFXjEABgUp1ch-EG6uQao8j5Rk1MLuZZYI/edit?usp=sharing). Module baselines capture the annual average crystalline silicon module (i.e. a market share weighted average of the silicon PV technologies deployed). Each material similarly is a market share weighted average of silicon PV technologies, compiled from multiple sources, most notably consistent with ITRPV data. Please see the Jupyter Journals (tutorials \\ baseline development documentation) for the derivations and sources (baselines \\ SupportingMaterials) of the provided c-Si baselines. Alternate module and material files can be created by the user, and an expanded set of PV technology baselines is planned for the future, including CdTe and perovskites.\n\n\nEnergy\n-------\n\nThe energy balance of renewable energy technologies is as important as the mass balance when evaluating sustainability. Additionally, few studies of Circular Economy (CE) pathways consider the energy return on investment of a particular pathway. 
PV ICE energy flows fill this analysis gap, and provide useful insights into the potential tradeoffs between mass and energy of CE pathways.\n\nThe energy flows of PV ICE are based on the mass flows. These energy flows, like the mass flows, are dynamic with time and are separated into module and material energies. For each supply chain process step captured in the mass flows, an energy per module area or energy per material mass is captured as an input (ex: module manufacturing energy, energy to manufacture rolled glass from silica sand, energy to crush a module for recycling). The energy demanded for each step is the sum of all electrical energy demands and all fuel/heating energy demands. \n\nWe provide an energy baseline for crystalline silicon modules and component materials. Data for these baselines are being compiled from [literature and report data](https://docs.google.com/spreadsheets/d/1Ec5JRBSN2NFXjEABgUp1ch-EG6uQao8j5Rk1MLuZZYI/edit?usp=sharing). For the complete derivation of the energy demands for crystalline silicon modules and materials, please see the Jupyter Journals (tutorials \\ baseline development documentation) and (baselines \\ SupportingMaterials). Alternate module and material files can be created by the user, and an expanded set of PV technology baselines is planned for the future, including CdTe and perovskites.\n\nAfter running a mass flow simulation, an energy flow calculation can be run which will multiply the energy demands by the mass flows and calculate annual generation from the deployed modules. Results of this calculation provide annual, cumulative, and lifetime energy demands and energy generated. These values can be used to calculate energy balance metrics such as energy return on investment (EROI), net energy, and energy payback time (EPBT). 
These features are actively under development, so check back for updates soon!\n\n\nInstallation for PV ICE\n=======================\n\nPV ICE releases may be installed using the ``pip`` and ``conda`` tools.\nPlease see the [Installation page](http://PV_ICE.readthedocs.io/en/latest/installation.html) of the documentation for complete instructions.\n\nPV ICE is compatible with Python 3.5 and above.\n\nInstall with:\n\n pip install PV_ICE\n\nFor developer installation, download the repository, navigate to the folder location and install as:\n\n pip install -e .\n\n\nHow to Get Started\n===================\n\nAfter you have installed PV ICE, we recommend heading over to our tutorials jupyter journals (PV ICE \\ docs \\ tutorials). There you will find journals [""0 - quick start Example""](https://github.com/NREL/PV_ICE/blob/development/docs/tutorials/0%20-%20quickStart%20Example.ipynb) and [""1 - Beginner Example""](https://github.com/NREL/PV_ICE/blob/development/docs/tutorials/1%20-%20Beginner%20Example.ipynb) which can help guide you through your first simulation using the PV ICE provided crystalline silicon PV baselines. In journals 2-4 we walk you through modifications to the basic simulation, including modifying parameters with PV ICE functions to suit your analysis needs.\n\n\nSome Analyses Featuring/Leveraging PV ICE\n==========================================\n\nPV ICE has been used in a variety of published analyses, including:\n\n**High Impact Report: The Solar Futures Report and Circular Economy Technical Report**\n\n Ardani, Kristen, Paul Denholm, Trieu Mai, Robert Margolis, \n Eric O\xe2\x80\x99Shaughnessy, Timothy Silverman, and Jarett Zuboy. 2021. \n \xe2\x80\x9cSolar Futures Study.\xe2\x80\x9d EERE DOE. \n https://www.energy.gov/eere/solar/solar-futures-study.\n\n Heath, Garvin, Dwarakanath Ravikumar, Silvana Ovaitt, \n Leroy Walston, Taylor Curtis, Dev Millstein, Heather Mirletz, \n Heidi Hartman, and James McCall. 2022. 
\n \xe2\x80\x9cEnvironmental and Circular Economy Implications of Solar Energy\n in a Decarbonized U.S. Grid.\xe2\x80\x9d NREL/TP-6A20-80818. NREL.\n\n**Peer Reviewed Journals**\n\n H. Mirletz, S. Ovaitt, S. Sridhar, and T. M. Barnes. 2022. \n \xe2\x80\x9cCircular Economy Priorities for Photovoltaics in the Energy Transition.\xe2\x80\x9d \n PLOS ONE 17 (9): e0274351. https://doi.org/10.1371/journal.pone.0274351.\n\n S. Ovaitt & H. Mirletz, S. Seetharaman, and T. Barnes, \n \xe2\x80\x9cPV in the Circular Economy, A Dynamic Framework Analyzing \n Technology Evolution and Reliability Impacts,\xe2\x80\x9d \n ISCIENCE, Jan. 2022, doi: https://doi.org/10.1016/j.isci.2021.103488.\n\n\nThere are other multiple publications citing PV ICE like PVSC, PVRW, etc. Please see the list in the [readthedocs](http://CircularEconomy-MassFlowCalculator.readthedocs.io/en/latest/) documentation.\n \n\n\nContributing\n============\n\nWe need your help to make PV ICE a great tool!\nPlease see the [Contributing page](http://PV_ICE.readthedocs.io/en/stable/contributing.html) for more on how you can contribute.\nThe long-term success of PV ICE requires substantial community support.\n\n\nLicense\n=======\n\nPV_ICE open-source code is copyrighted by the Alliance for Sustainable Energy and licensed with BSD-3-Clause terms, found [here](https://github.com/NREL/PV_ICE/blob/main/LICENSE.md).\n\n\nGetting support\n===============\n\nIf you suspect that you may have discovered a bug or if you\'d like to\nchange something about CF-MFA, then please make an issue on our\n[GitHub issues page](https://github.com/NREL/PV_ICe/issues).\n\n\nCiting\n======\n\nIf you use PV_ICE in a published work, please cite:\n\n S. Ovaitt & H. Mirletz, S. Seetharaman, and T. Barnes, \n \xe2\x80\x9cPV in the Circular Economy, A Dynamic Framework Analyzing \n Technology Evolution and Reliability Impacts,\xe2\x80\x9d \n ISCIENCE, Jan. 
2022, doi: https://doi.org/10.1016/j.isci.2021.103488.\n\n\nand please also cite the DOI corresponding to the specific version of\nPV_ICE that you used. PV_ICE DOIs are listed at\n[Zenodo.org](https://zenodo.org/badge/latestdoi/248347431). For example, for version 0.3.2:\n\n\tS. Ovaitt, H. Mirletz, M. Mendez Ribo (2023). \n\tNREL/PV_ICE: v0.3.2 Release. Zenodo. \n\thttps://doi.org/10.5281/zenodo.7651576\n'",",https://zenodo.org/badge/latestdoi/248347431,https://doi.org/10.1371/journal.pone.0274351.\n\n,https://doi.org/10.1016/j.isci.2021.103488.\n\n\nThere,https://doi.org/10.1016/j.isci.2021.103488.\n\n\nand,https://zenodo.org/badge/latestdoi/248347431","2020/03/18, 21:31:34",1315,CUSTOM,132,934,"2023/10/25, 00:39:36",4,18,22,5,0,0,0.1,0.5184729064039408,"2023/02/18, 00:07:12",v0.3.2,0,7,false,,false,false,,,https://github.com/NREL,http://www.nrel.gov,"Golden, CO",,,https://avatars.githubusercontent.com/u/1906800?v=4,,,
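The dynamic mass-flow framing behind PV ICE (cohorts of installed modules reaching end of life, then splitting into recycled and landfilled streams) can be illustrated with a stdlib-only toy model. This is a sketch under simplified assumptions (a single fixed lifetime and one recycling fraction), not the PV_ICE API:

```python
# Toy dynamic mass-flow model (illustrative only -- NOT the PV_ICE API):
# each year's installed module mass reaches end-of-life after a fixed
# lifetime, and a recycling fraction splits the retired mass into
# recycled material vs. landfill.

def mass_flow(installs_tonnes, lifetime_years, recycle_fraction):
    """Return per-year (in_field, recycled, landfilled) mass totals."""
    n = len(installs_tonnes)
    in_field = [0.0] * n
    recycled = [0.0] * n
    landfilled = [0.0] * n
    cumulative = 0.0
    for year in range(n):
        cumulative += installs_tonnes[year]
        retiring_year = year - lifetime_years
        if retiring_year >= 0:
            retired = installs_tonnes[retiring_year]
            cumulative -= retired
            recycled[year] = retired * recycle_fraction
            landfilled[year] = retired * (1 - recycle_fraction)
        in_field[year] = cumulative
    return in_field, recycled, landfilled

# Three years of installations, a 3-year lifetime, 25% recycling:
in_field, rec, land = mass_flow([100, 150, 200, 0, 0],
                                lifetime_years=3, recycle_fraction=0.25)
```

The real PV ICE baselines replace the fixed lifetime with reliability distributions and track many materials, but the bookkeeping idea is the same.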
Solar electricity Nowcasting,Build the world's best near-term forecasting system for solar electricity generation.,openclimatefix,https://github.com/openclimatefix/nowcasting.git,github,nowcasting,Photovoltaics and Solar Energy,"2023/10/16, 11:44:00",85,0,35,true,TypeScript,Open Climate Fix,openclimatefix,"TypeScript,JavaScript,Python,CSS,Shell",https://openclimatefix.org/projects/nowcasting/,"b'# Solar Electricity Nowcasting\n\n[](#contributors-)\n\n\nThis is a ""meta-repository"" for [Open Climate Fix](https://openclimatefix.org/)\'s solar electricity nowcasting project. See [this great Wired article about OCF\'s solar electricity forecasting work](https://www.wired.co.uk/article/solar-weather-forecasting) for a good intro to solar electricity nowcasting.\n\nThe plan is to enable the community to build the world\'s best near-term forecasting system for solar electricity generation, and then let anyone use it! :) We\'ll do this by using state-of-the-art machine learning and 5-minutely satellite imagery to predict the movement of clouds over the next few hours, and then use this to predict solar electricity generation.\n\nThe term ""nowcasting"" just means ""forecasting for the next few hours using statistical techniques"".\n\n# Why is all this stuff open-source?\n\nIn OCF, we\'re curious to see if it\'s possible to rapidly mitigate climate change by:\n\n1. Enabling thousands of people to help solve ML problems which, if solved, might help reduce CO2 emissions\n2. Running small(ish) pilot projects to implement the best solution in industry\n3. Enabling thousands of practitioners to use the code in their products.\n\n# What\'s the likely climate impact?\n\nIt\'s really, really, _really_ hard to estimate climate impact of forecasting! 
But, as a super-rough back-of-the-envelope calculation, we estimate that better solar forecasts, if rolled out globally, could reduce CO2 emissions by about a billion tonnes between now and 2035.\n\n# Getting involved\n\n- [List of ""good first issues""](https://github.com/search?l=&p=1&q=user%3Aopenclimatefix+label%3A%22good+first+issue%22&ref=advsearch&type=Issues&utf8=%E2%9C%93&state=open): GitHub ""issues"" which describe changes we\'d like to make to the code.\n- [OCF\'s coding style](https://github.com/openclimatefix/nowcasting/blob/main/coding_style.md)\n- The main tools we use include: PyTorch, PyTorch Lightning, xarray, pandas, pvlib\n\n# Overview of OCF\'s nowcasting repositories\n\n## Downloading data & getting the data in the right shape for ML experiments\n\n- [nowcasting_dataset](https://github.com/openclimatefix/nowcasting_dataset): Pre-prepares ML training batches. Loads satellite data, numerical weather predictions, solar PV power generation timeseries, and other datasets. Outputs pre-prepared ML training batches as NetCDF files (one batch per NetCDF file).\n- [Satip](https://github.com/openclimatefix/Satip): Retrieve, transform and store EUMETSAT data.\n- [pvoutput](https://github.com/openclimatefix/pvoutput): Python code for downloading PV data from [PVOutput.org](https://PVOutput.org).\n\n### Older code (no longer maintained)\n\n- [satellite_image_processing](https://github.com/openclimatefix/satellite_image_processing)\n- [eumetsat](https://github.com/openclimatefix/eumetsat): Tools for downloading and processing satellite images from EUMETSAT\n\n## Machine Learning\n\n### Main repositories for our experiments:\n\n- [satflow](https://github.com/openclimatefix/satflow): Satellite Optical Flow with machine learning models. 
Predicting the next few hours of satellite imagery from the recent history of satellite imagery (and other data sources).\n- [predict_pv_yield](https://github.com/openclimatefix/predict_pv_yield): Using optical flow (and the output of satflow) & machine learning to predict solar PV yield (i.e. to predict the power generated by solar electricity systems over the next few hours). An older set of experiments is in [predict_pv_yield_OLD](https://github.com/openclimatefix/predict_pv_yield_OLD), which is no longer maintained.\n- [nowcasting_utils](https://github.com/openclimatefix/nowcasting_utils): Forecasting performance metrics, plotting functions, loss functions, etc.\n- [nowcasting_dataloader](https://github.com/openclimatefix/nowcasting_dataloader): PyTorch dataloader for taking pre-prepared batches from `nowcasting-dataset` and getting them into our models.\n\n### PyTorch implementations of ML models from the literature\n\n- [MetNet](https://github.com/openclimatefix/metnet): PyTorch implementation of Google Research\'s MetNet ([S\xc3\xb8nderby et al. 2020](https://arxiv.org/abs/2003.12140)), inspired by Thomas Capelle\'s [metnet_pytorch](https://github.com/tcapelle/metnet_pytorch/tree/master/metnet_pytorch).\n- [skillful_nowcasting](https://github.com/openclimatefix/skillful_nowcasting): Implementation of DeepMind\'s Skillful Nowcasting GAN ([Ravuri et al. 2021](https://arxiv.org/abs/2104.00954)) in PyTorch Lightning.\n- [perceiver-pytorch](https://github.com/openclimatefix/perceiver-pytorch): Implementation of DeepMind\'s Perceiver ([Jaegle et al. 2021](https://arxiv.org/abs/2103.03206)) and Perceiver IO ([Jaegle et al. 2021](https://arxiv.org/abs/2107.14795)) in PyTorch. 
Forked from [lucidrains/perceiver-pytorch](https://github.com/lucidrains/perceiver-pytorch).\n\n### Older code (no longer maintained)\n\n- [solar-power-mapping-data](https://github.com/openclimatefix/solar-power-mapping-data): Code to create rich harmonised geographic data for PV installations from OpenStreetMap and other sources. Mostly by Dan Stowell, The Turing Institute, and Sheffield Solar. The code behind the 2020 paper [""A harmonised, high-coverage, open dataset of solar photovoltaic installations in the UK""](https://www.nature.com/articles/s41597-020-00739-0) by Stowell et al.\n- [predict_pv_yield_OLD](https://github.com/openclimatefix/predict_pv_yield_OLD)\n- [predict_pv_yield_NWP](https://github.com/openclimatefix/predict_pv_yield_nwp): Build a baseline model for predicting PV yield using NWP (numerical weather predictions), as opposed to satellite imagery. This model is intentionally very simple, so we can get an end-to-end system up and running quickly to iterate on.\n- [metoffice_ec2](https://github.com/openclimatefix/metoffice_ec2): Extract specific parts of the [UK Met Office\'s UKV and MOGREPS-UK numerical weather predictions from AWS](https://registry.opendata.aws/uk-met-office/), compress, and save to S3 as Zarr. Intended to run on AWS EC2.\n- [metoffice_aws_lambda](https://github.com/openclimatefix/metoffice_aws_lambda): Simple AWS Lambda function to extract specific parts of the UK Met Office\'s UKV and MOGREPS-UK numerical weather predictions, compress, and save to S3 as Zarr. (We found that AWS Lambda is not a good fit for this task because we actually have to do a bit of heavy-lifting, which gets very expensive on Lambda!)\n\n## Operational solar nowcasting\n\n- [nowcasting_api](https://github.com/openclimatefix/nowcasting_api): API for hosting nowcasting solar predictions. 
Will just return \'dummy numbers\' until about mid-2022!\n\nFor a complete list of all of OCF\'s repositories tagged with ""nowcasting"", see [this link](https://github.com/search?l=&o=desc&q=topic%3Anowcasting+org%3Aopenclimatefix&s=updated&type=Repositories)\n\n## Contributors \xe2\x9c\xa8\n\nThanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):\n\n\n\n\n\n\n\n\n\n\n\nThis project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification. Contributions of any kind welcome!\n'",",https://arxiv.org/abs/2003.12140,https://arxiv.org/abs/2104.00954,https://arxiv.org/abs/2103.03206,https://arxiv.org/abs/2107.14795","2020/12/15, 15:40:01",1044,MIT,328,818,"2023/10/18, 12:13:52",48,160,372,138,7,9,2.5,0.6124197002141327,,,1,12,false,,false,false,,,https://github.com/openclimatefix,openclimatefix.org,London,,,https://avatars.githubusercontent.com/u/48357542?v=4,,,
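As a concrete illustration of ""forecasting for the next few hours using statistical techniques"", here is the classic clear-sky-persistence baseline that ML solar nowcasts are usually benchmarked against. This is a generic sketch (the function name and interface are invented here), not one of OCF's models:

```python
# Clear-sky persistence baseline: assume the current clear-sky index
# (observed power / modelled clear-sky power) holds for the next few
# time steps. A minimal, stdlib-only sketch of a standard reference
# model -- not OCF's actual forecasting code.

def clearsky_persistence(obs_now, clearsky_now, clearsky_future):
    """Forecast future power by holding the current clear-sky index."""
    if clearsky_now <= 0:          # night / no modelled clear-sky output
        return [0.0] * len(clearsky_future)
    k = obs_now / clearsky_now     # current clear-sky index
    return [k * cs for cs in clearsky_future]

# Currently at 60% of clear-sky output; hold that ratio going forward.
forecast = clearsky_persistence(obs_now=3.0, clearsky_now=5.0,
                                clearsky_future=[4.0, 2.0, 0.0])
```

Any model that cannot beat this baseline adds no skill, which is why it is the usual yardstick in solar forecasting.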
Solar Forecast Arbiter,"Core data gathering, validation, processing, and reporting package for the Solar Forecast Arbiter.",SolarArbiter,https://github.com/SolarArbiter/solarforecastarbiter-core.git,github,,Photovoltaics and Solar Energy,"2023/01/24, 23:03:27",31,4,10,true,Python,Solar Forecast Arbiter,SolarArbiter,"Python,Jinja,HTML,TeX,Dockerfile,C",https://solarforecastarbiter-core.readthedocs.io,"b'[](https://github.com/SolarArbiter/solarforecastarbiter-core/actions)\n[](https://lgtm.com/projects/g/SolarArbiter/solarforecastarbiter-core/alerts/)\n[](https://codecov.io/gh/solararbiter/solarforecastarbiter-core)\n[](https://solarforecastarbiter-core.readthedocs.io/en/latest/?badge=latest)\n[](https://doi.org/10.5281/zenodo.3473590)\n[](https://quay.io/repository/solararbiter/solarforecastarbiter-core)\n\n# solarforecastarbiter-core\nCore Solar Forecast Arbiter data gathering, validation, processing, and\nreporting package.\n\n# Installation\n\nSee the [installation](https://solarforecastarbiter-core.readthedocs.io/en/latest/installation.html) instructions in the documentation.\n\n# Documentation\n\nThe documentation is hosted at [solarforecastarbiter-core.readthedocs.io](https://solarforecastarbiter-core.readthedocs.io/en/latest/)\n\n# Contributing\n\nWe welcome your contributions. 
Please see our [contributing guide](https://solarforecastarbiter-core.readthedocs.io/en/latest/contributing.html).\n\n# Architecture\n\nThe Solar Forecast Arbiter consists of the Dashboard, API, and this Core package.\nSee [solarforecastarbiter.org/documentation](https://solarforecastarbiter.org/documentation/)\nfor descriptions of each project and how they work together.'",",https://doi.org/10.5281/zenodo.3473590","2019/01/23, 17:31:36",1736,MIT,19,493,"2023/06/30, 07:18:12",151,350,661,7,117,12,1.8,0.6616257088846881,"2022/02/11, 21:26:33",1.0.13,0,7,false,,false,true,"Njadrick/solar-dashboard-,SolarArbiter/workshop,SolarArbiter/solarforecastarbiter-dashboard,SolarArbiter/solarforecastarbiter-api",,https://github.com/SolarArbiter,https://forecastarbiter.epri.com,United States of America,,,https://avatars.githubusercontent.com/u/43686373?v=4,,,
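At its core, forecast validation and reporting of this kind reduces to comparing forecast and observation series with deterministic metrics such as MAE, RMSE, and mean bias. A minimal stdlib sketch of those standard definitions (illustrative only, not the solarforecastarbiter API):

```python
import math

# Standard deterministic forecast metrics (sketch, not the
# solarforecastarbiter API): mean absolute error, root mean square
# error, and mean bias error, computed from paired obs/forecast series.

def evaluate(obs, fx):
    errors = [f - o for o, f in zip(obs, fx)]
    n = len(errors)
    return {
        "mae": sum(abs(e) for e in errors) / n,
        "rmse": math.sqrt(sum(e * e for e in errors) / n),
        "mbe": sum(errors) / n,   # positive = forecast over-predicts
    }

metrics = evaluate(obs=[100, 200, 300], fx=[110, 190, 330])
```

A real report layers resampling, data-quality flags, and reference-forecast skill scores on top of these basics.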
pv-system-profiler,Estimating PV array location and orientation from real-world power datasets.,slacgismo,https://github.com/slacgismo/pv-system-profiler.git,github,,Photovoltaics and Solar Energy,"2021/10/13, 19:41:56",8,3,2,false,Jupyter Notebook,SLAC GISMo,slacgismo,"Jupyter Notebook,Python,Shell",,"b'# pv-system-profiler\n### Estimating PV array location and orientation from real-world power datasets.\n\n\n\n | Latest Release | \n \n \n \n \n \n \n \n \n \n \n |
\n\n | License | Build Status | Code Quality | Publications | PyPI Downloads | Conda Downloads | \n
\n\n## Install & Setup\n\n#### 1) Recommended: Set up `conda` environment with provided `.yml` file\n\nWe recommend setting up a fresh Python virtual environment in which to use `pv-system-profiler`. We recommend using the [Conda](https://docs.conda.io/projects/conda/en/latest/index.html) package management system, and creating an environment with the environment configuration file named `pvi-user.yml`, provided in the top level of this repository. This will install the `statistical-clear-sky` and `solar-data-tools` packages as well.\n\nCreating the env:\n\n```bash\n$ conda env create -f pvi-user.yml\n```\n\nStarting the env:\n\n```bash\n$ conda activate pvi_user\n```\n\nStopping the env\n\n```bash\n$ conda deactivate\n```\n\nAdditional documentation on setting up the Conda environment is available [here](https://github.com/slacgismo/pvinsight-onboarding/blob/main/README.md).\n\n\n#### 2) PIP Package\n\n```sh\n$ pip install pv-system-profiler\n```\n\nAlternative: Clone repo from GitHub\n\nMimic the pip package by setting up locally.\n\n```bash\n$ pip install -e path/to/root/folder\n```\n\n#### 3) Anaconda Package\n\n```sh\n$ conda install -c slacgismo pv-system-profiler\n```\n\n## Solver Dependencies\n\nRefer to [solar-data-tools](https://github.com/slacgismo/solar-data-tools) documentation to get more info about solvers being used.\n\n## Usage / Run Scripts\n### Serial run\nThe `parameter_estimation_script.py` script creates a report of all systems based on the `csv` files with the system signals located in a given folder.\nThe script takes all input parameters as `kwargs`. 
The example below illustrates the use of the report script:\n```shell\npython \'repository location of run script\'/parameter_estimation_script.py report None all \ns3://s3_bucket_with_signals/ \'repeating_part_of label\' /home/results.csv True False \nFalse False s3://\'s3_path_to_file_containing_metadata/metadata.csv\' None s3\n```\nHere the full path to `parameter_estimation_script.py` is given, followed by these positional `kwargs`:\n1. The estimation to run, here `report`.\n2. An optional `csv` file listing the sites to be analyzed; `None` means no list is provided.\n3. The number of input-signal files to analyze (`n_files`); `all` means every input signal in the repository is analyzed.\n4. The location of the `csv` files containing the input signals, here the `s3` bucket `s3://s3_bucket_with_signals/`. These files are usually of the form `ID_repeating_part_of_label.csv`, for example `1_composite_10.csv` and `2_composite_10.csv`, where `_composite_10` is the repeating part of the label.\n5. The repeating part of the label, either `None` or a string as in the example above.\n6. The absolute path to the desired location of the results file, here `/home/results.csv`.\n7. A Boolean setting the `correct_tz` pipeline `kwarg`.\n8. A Boolean setting the `fix_shifts` pipeline `kwarg`.\n9. `check_json`, a Boolean indicating whether a `json` file with additional site information to be analyzed is present in `s3://s3_bucket_with_signals/`.\n10. A Boolean setting the `convert_to_ts` `kwarg` when instantiating the data handler.\n11. The full path to the `csv` file containing site metadata, here `metadata.csv`. This file is optional and the `kwarg` may be set to `None`; the columns it needs depend on the estimation. For a `report` estimation, it should contain `site`, `system` and `gmt_offset` columns; for a `longitude` estimation, `site`, `system` and `latitude` columns; and for a `tilt_azimuth` estimation, `site`, `system`, `gmt_offset`, `estimated_longitude`, `estimated_latitude`, `tilt` and `azimuth` columns, each with the respective values for each system. Additionally, if a manual inspection for time shifts was performed, a `time_shift_manual` column may be included (zero for systems with no time shift, one for systems with a time shift); when present, it determines whether the `fix_dst()` method is run after instantiating the data handler.\n12. `gmt_offset`, here set to `None`. If a numeric value is given instead, all sites use that single value when running the report.\n13. `data_source`, here `s3`, since the files with the input signals are located in an `s3` bucket.\n\n## Partitioned run\nA script is provided that runs the site report and the longitude, latitude, and tilt and azimuth estimations on a prescribed number of Amazon Web Services (AWS) instances. The script reads the folder containing the system signals and partitions these signals to run on `n` user-prescribed AWS instances in parallel. Here is an example shell command for a partitioned run:\n```shell\npython \'repository location of run script\'/run_partition_script.py parameter_estimation_script.py report None all \ns3://s3_bucket_with_signals/ \'repeating_part_of label\' /home/results.csv True False \nFalse False s3://\'s3_path_to_file_containing_metadata/metadata.csv\' None s3\n\'repository location of run script\'/parameter_estimation_script.py pvi-dev my_instance\n```\nThis script takes the same inputs as `parameter_estimation_script.py` plus three additional parameters: the estimation run script, `parameter_estimation_script.py` (third to last); the conda environment used to run the estimation, here `pvi-dev` (second to last); and the name of the AWS instances used for the run, here `my_instance` (last). Before running this command it is necessary to create `n` identical AWS instances, one per desired partition, all carrying the same `Name=\'instance name\'` AWS tag. The simplest way to accomplish this is to start from an AWS image of a previously configured instance that has all the repositories and conda environments needed for a serial run. Once each partitioned run is finished, results are automatically collected in the local folder where `run_partition_script.py` was run. 
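The partitioning idea above, splitting the input-signal files across `n` AWS instances, can be sketched with a simple round-robin chunker. `partition` is a hypothetical helper for illustration, not code from `run_partition_script.py`:

```python
# Round-robin partitioning sketch (hypothetical helper, not the actual
# run_partition_script.py code): distribute the input-signal files so
# the per-instance workloads differ in size by at most one file.

def partition(files, n_instances):
    """Split `files` across `n_instances` chunks, round-robin."""
    chunks = [[] for _ in range(n_instances)]
    for i, f in enumerate(files):
        chunks[i % n_instances].append(f)
    return chunks

# Seven input-signal files spread over three instances.
chunks = partition([f"{i}_composite_10.csv" for i in range(1, 8)], 3)
```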
\n## Unit tests\n\nIn order to run unit tests:\n```\npython -m unittest -v\n```\n\n## Test Coverage\n\nIn order to view the current test coverage metrics:\n```\ncoverage run --source pvsystemprofiler -m unittest discover && coverage html\nopen htmlcov/index.html\n```\n\n## Versioning\n\nWe use [Semantic Versioning](http://semver.org/) for versioning. For the versions available, see the [tags on this repository](https://github.com/slacgismo/pv-system-profiler/tags).\n\n## License\n\nThis project is licensed under the BSD 2-Clause License - see the [LICENSE](LICENSE) file for details\n'",",https://zenodo.org/badge/latestdoi/183074637","2019/04/23, 18:33:36",1646,BSD-2-Clause,0,778,"2021/10/13, 19:42:00",4,28,29,0,742,1,0.1,0.19581749049429653,"2021/10/13, 19:42:35",v0.1.4,0,4,false,,false,true,"MichaelHopwood/ForwardForwardOneclass,ChrisKre/Photovoltaik_GAN,slacgismo/solar-data-tools",,https://github.com/slacgismo,https://gismo.slac.stanford.edu/,"SLAC National Accelerator Laboratory, Menlo Park, CA 94025",,,https://avatars.githubusercontent.com/u/19895500?v=4,,,
"A Global Inventory of Commercial-, Industrial-, and Utility-Scale Photovoltaic Solar Generating Units",Used to produce a global inventory of utility-scale solar photovoltaic generating station.,Lkruitwagen,https://github.com/Lkruitwagen/solar-pv-global-inventory.git,github,,Photovoltaics and Solar Energy,"2021/11/25, 08:55:36",122,0,20,false,Jupyter Notebook,,,"Jupyter Notebook,Python",,"b'# A Global Inventory of Commerical-, Industrial-, and Utility-Scale Photovoltaic Solar Generating Units\nRepository for machine learning and remote sensing pipeline described in [Kruitwagen, L., Story, K., Friedrich, J., Byers, L., Skillman, S., & Hepburn, C. (2021) A global inventory of photovoltaic solar energy generating units, _Nature_ **598**, 604\xe2\x80\x93610](https://www.nature.com/articles/s41586-021-03957-7). \n\n# Project Summary\n\n## Abstract\n\nPhotovoltaic (PV) solar energy generating capacity has grown by 41\\% per year since 2009. This rapid deployment of solar energy must continue if climate and Sustainable Development Goals are to be met. Energy system projections that mitigate climate change and facilitate universal energy access show a nearly ten-fold increase in PV solar energy generating capacity by 2040. Geospatial data describing the energy system is required to manage generation intermittency, mitigate climate change risks, and identify trade-offs with biodiversity, conservation, and land protection priorities caused by the land use and land cover change necessary for PV deployment. Currently available inventories of solar generating capacity cannot fully address these needs. Here, we provide a global inventory of commercial-, industrial-, and utility-scale PV solar energy generation stations (i.e. PV generating stations in excess of 10kW nameplate capacity) using a longitudinal corpus of remote sensing imagery, machine learning, and a large cloud computation infrastructure. 
We locate and verify 68,661 facilities, an increase of 253\\% (in number of facilities) on the previously best-available asset-level data. With the help of a hand-labelled test set, we estimate global installed generating capacity to be 423GW [-75GW, +77GW] at the end of 2018. Enrichment of our dataset with estimates of facility installation date, historic land cover classification, and proximity to protected areas and indigenous and community lands allows us to show that the majority of the PV solar energy facilities are sited on cropland, followed by aridlands and grassland. Our inventory can aid PV delivery aligned with the Sustainable Development Goals.\n\n## Figure Highlights\n\n### Computer Vision with Remote Sensing Imagery to Detect Solar PV\n\n**Figure 1:** We detect utility-scale (>10kW) solar PV facilities with machine learning in Sentinel-2 and SPOT6/7 remote sensing imagery. Here, we show out-of-training-sample examples showing SPOT6/7 and Sentinel-2 optical imagery, primary inference from U-Net[2](https://arxiv.org/abs/1505.04597) computer vision models, and vectorised polygon outputs. Our models are robust to a variety of geometries and orientations, land covers, seasons, and atmospheric conditions.\n\n\n\n\n### A Machine Learning Pipeline for Global Deployment\n\n**Figure A1:** Our machine learning pipeline diagram. The pipeline was split into two branches, one for each satellite constellation, and two steps: global search to minimise false negatives, and filtering to eliminate false positives. The pipeline was deployed on 72.1mn km2, approximately half of the Earth\'s land surface area, based on population density. Additional machine learning models were used to filter the dataset for false positives. The remaining detections were verified by hand to ensure a high-quality dataset. Installation date for each solar PV facility was inferred heuristically from the detection timeseries. 
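The heuristic inference of installation dates from a detection timeseries can be sketched as "the first date from which the facility is detected persistently". This is a simplified stand-in for the paper's actual heuristic, with an invented `min_consecutive` persistence threshold:

```python
# Simplified install-date heuristic (illustrative stand-in for the
# paper's rule): return the first date that begins a run of at least
# `min_consecutive` positive detections in the time-ordered series.

def infer_install_date(detections, min_consecutive=2):
    """detections: time-ordered list of (date_string, bool_detected)."""
    run = 0
    for i, (date, detected) in enumerate(detections):
        run = run + 1 if detected else 0
        if run == min_consecutive:
            # Date at the start of the qualifying run.
            return detections[i - min_consecutive + 1][0]
    return None  # never persistently detected in the study window

ts = [("2016-06", False), ("2016-12", True), ("2017-06", False),
      ("2017-12", True), ("2018-06", True), ("2018-12", True)]
install_date = infer_install_date(ts)
```

Requiring more than one consecutive detection guards against a single false positive (e.g. a cloud artefact) being read as a construction event.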
\n\n\n\n### A Global Dataset\n\n**Figure 2:** We deploy our pipeline on imagery captured until 2018-12-31, providing a snapshot of the state of utility-scale solar PV diffusion at the end of 2018. We visualise our dataset and observe the emergence of hotspots in space and time. We use global data for incident irradiation and solar PV productivity to estimate facility-level AC generation capacity. Over our 30-month study window, we observe an increase of 81\\% in deployed generating capacity, led by increases in China (120\\%), India (184\\%), the EU-27+GB (20\\%), the United States (58\\%), and Japan (119\\%). \n\n\n\n### Novel Land-Cover Analysis\n\n**Figure 3:** To demonstrate the utility of our asset-level dataset, we prepare an analysis of pre-installation landcover for utility-scale solar PV. The land chosen for the development of solar PV has impacts on and trade-offs with the costs of the solar PV system, greenhouse gas emissions net of land cover change, ecosystem health, water resources and food production, land and property values, and political acceptability, and so is an urgent priority for study. We find no consistent trend in land cover chosen for solar PV development over the study period (panel b), and we observe that the areas chosen for PV deployment skew heavily towards areas with excessive cropland (panel e). However, within these areas, deployment skews to barren and grasslands (panel d). Installation size skews larger for barren land covers (i.e. solar PV mega-projects) and smaller for developed areas (i.e. rooftop commercial and industrial installations) (panel c).\n\n\n\n### Detailed Country-Level Insight\n\n**Figure A10:** We provide analysis at the country-level for the top 20 countries in our dataset. PV installations in most countries displace cropland. China, Chile, India, and South Africa have barren-land PV megaprojects. 
Among European Economic Area countries, France and Germany are unique for showing a local skew _towards_ developed areas, while all others show a reinforced skew towards croplands. Development in most countries appears to disfavour sites with pre-existing forests, with the exception of South Africa.\n\n\n\n## Dataset Availability\n\nRecognising the fundamental public-goods nature of asset-level data and its importance in the urgent mitigation of climate change, we make our dataset publicly available.\n\nThe complete dataset can be downloaded from the [Zenodo data repository](https://zenodo.org/record/5005868).\n\nAn interactive visualisation of our dataset is available from the World Resources Institute [here](https://resourcewatch.org/data/explore/ene032-Solar-Plants_1).\n\n## Acknowledgements\n\nThe authors acknowledge the generous contribution of [Descartes Labs, Inc.](https://www.descarteslabs.com/) which provided the authors with API credentials for easy imagery access and manipulation, and a cloud computation platform for imagery analysis. Descartes Labs is a spin-out company from Los Alamos National Laboratory that provides a data refinery for satellite imagery. The authors also acknowledge the generous support of the [World Resources Institute](https://www.wri.org/) who provided insight and data resources to the project. [Wiki-Solar](https://wiki-solar.org/) also provided valuable insight and data. The Sentinel-2 semantic segmentation model was trained on Amazon Web Services with a supporting grant. The hand-verification of Sentinel-2 detections was supported by Microsoft Azure cloud computing services with credits provided by the AIforEarth program.\n\n\n# Repository\n\n## Setup\n\n### Virtual Environment\n\nWe recommend using Conda for package and environment management. 
Create a new conda environment:\n\n    conda create -n solar-pv python=3.6\n\n### Clone Repository\n\nClone this repository using git:\n\n    git clone\n\nAdd the directory root to the Python path environment variable:\n\n    export PYTHONPATH=$(pwd):$PYTHONPATH\n\n(optional) You may want to add this to a bash script for your environment:\n\n    touch //path/to/conda/envs/solar-pv/etc/conda/activate.d/env_vars.sh\n    nano //path/to/conda/envs/solar-pv/etc/conda/activate.d/env_vars.sh\n\nThen input:\n\n    export PYTHONPATH=$(pwd):$PYTHONPATH\n\nand save and exit.\n\n### Install Packages\n\nInstall Python packages via pip:\n\n    pip install -r requirements.txt\n\n### Descartes Labs\n\nDescartes Labs alpha and Airbus SPOT6/7 access is required to run this repository.\n\n### Gurobi\n\nMatching installations to existing data uses a mixed integer linear program specified using [PuLP](https://pypi.org/project/PuLP/). We use [Gurobi](https://www.gurobi.com/), a commercial solver, to solve the linear program. Gurobi requires a [license](https://www.gurobi.com/documentation/9.0/quickstart_mac/retrieving_and_setting_up_.html); it can be installed with:\n\n    conda config --add channels http://conda.anaconda.org/gurobi\n    conda install gurobi\n\n## Directories and Scripts\n- **solarpv\\\\**\n - **training\\\\**\n - **s2\\\\**\n - `model_resunet.json`: ResUNet specification\n - `S2_training_data.py`: Training data generator from cloud resource to disk\n - `training_data_mp.py`: Multithreaded training data generator from cloud resource to disk\n - `train_S2_RNN1.py`: Training for S2 Branch RNN-1\n - `train_S2_RNN2.py`: Training for S2 Branch RNN-2\n - `train_S2_unet.py`: Training for S2 Branch UNet with generator from disk\n - **spot\\\\**\n - `generator.py`: SPOT UNet training generator\n - `optimizer.py`: SPOT UNet training optimizer\n - `train.py`: SPOT UNet training entry point\n - `train_classifier.py`: SPOT classifier training entry point\n - `train_solar_unet.ipynb`: SPOT UNet training 
prototyping\n - `transforms.py`: SPOT UNet training transforms\n - `unet.py`: SPOT UNet model generator\n - **deployment\\\\**\n - `cloud_dl_functions.py`: Pipeline functions for deployment on DL tasks queuing\n - `create_cloud_functions.py`: Scripts for deploying DL cloud functions\n - `create_cloud_products.py`: Scripts for creating DL cloud products for reading/writing by cloud functions\n - `pipeline.py`: Entrypoint for running geographies through the deployment pipeline\n - `store_model.py`: Scripts to push inference models to cloud storage\n - **analysis\\\\**\n - **quality\\\\**\n - `deploy_precisions.ipynb`: Notebook for obtaining precision of S2 and SPOT branch deployment\n - `precision-recall.ipynb`: Notebook for obtaining cross-validation precision, recall, and intersection-over-union for all pipeline stages\n - `S2_band_dropout.py`: Band dropout analysis for the primary S2 inference model\n - `SPOT_band_dropout.ipynb`: Band dropout analysis for the primary SPOT inference model\n - **matching\\\\**\n - `match_region.py`: Mixed-integer linear programming (MILP) matching script for our dataset with other asset-level data\n - `match_postprocess.ipynb`: Notebook to extract insight after matching between our dataset and other available datasets\n - `MILP_WRI-matching_stripped.ipynb`: Prototyping for MILP matching\n - `vincenty.py`: Vincenty geodescic distance scripts\n - **landcover\\\\**\n - `add_land_cover.py`: Multithreaded script for adding land cover to PV detections\n - `cloud_land_cover.ipynb`: Notebook for cloud-based addition of land cover to PV detections\n - `land_cover_skew_analysis.ipynb`: Notebook for analysis of land cover in our PV detections\n - `land_cover_skew_deploy.ipynb`: Notebook for cloud-based reduction of land cover across large geographies\n - **generating_capacity\\\\**\n - `MW_capacity.ipynb`: Add generating capacity to PV detections\n - `utils.py`: Shared utilities\n- **makefigs\\\\**\n - `fig-1_prediction_map.py`: Script to 
generate Figure 1\n - `fig-2_results_map.py`: Script to generate Figure 2\n - `fig-3_fig-A10_maplandcover.py`: Script to generate Figure 3 and Figure A10\n - `fig-A2_area_dist.py`: Script to generate Figure A2\n - `fig-A3_deployment_area.py`: Script to generate Figure A3\n - `fig-A6_PR_summary.py`: Script to generate Figure A6\n - `fig-A7_deploy_precision.py`: Script to generate Figure A7\n - `fig-A8_band_perturbation.py`: Script to generate Figure A8\n - `fig-A9_install_date_US.py`: Script to generate Figure A9\n - **figures\\\\**\n - [All figures used in the preparation of the paper]\n- **data\\\\**\n - [All data used in the training, deployment, and analysis workflows]\n- **bin\\\\**\n - `CORINE2DL.ipynb`: Notebook for uploading Copernicus CORINE land cover products to DL product\n - `genyaml_cloudfunctions.py`: Script to generate DL cloud function YAML \n - `genyaml_cloudproducts.py`: Script to generate DL cloud product YAML\n - `make_S2_cv_samples.py`: Script to generate samples for S2 Branch cross-validation\n - `MODIS2DL.ipynb`: Notebook for uploading MODIS land cover products to DL product\n - `color_gdf.ipynb`: Notebook to add color to PV detections for Earth Engine visualisation\n\n\n## Data\n\nAll data, including training, validation, test, and predicted datasets, is available to download from the [Zenodo repository](https://zenodo.org/record/5005868). An interactive visualisation is also offered by the [World Resources Institute ResourceWatch](https://resourcewatch.org/data/explore/ene032-Solar-Plants).\n\n## Workflows\n\n### Training\n\n#### Sentinel-2 Training\n\n1. Sentinel-2 training samples can be generated using the multithreaded generator: \n```python\npython solarpv/training/s2/training_data_mp.py\n```\n\n2. Train the primary inference UNet model, RNN-1, and RNN-2\n```python\npython solarpv/training/s2/train_S2_unet.py\npython solarpv/training/s2/train_S2_RNN1.py\npython solarpv/training/s2/train_S2_RNN2.py\n```\n\n#### SPOT Training\n\n1. 
SPOT training samples use proprietary Airbus SPOT6/7 imagery and so must be retained on an infrastructure licensed for SPOT6/7 imagery. Training the SPOT branch requires access to the DescartesLabs platform. Contact the authors for details.\n2. Obtain the UNet training imagery and move these images into `data/SPOT_train/ground/` and obtain or develop `train_keys.txt` and `val_keys.txt`.\n3. Train the SPOT UNet model, entering the SPOT training scripts with `solarpv/training/spot/train.py`:\n```bash\npython solarpv/training/spot/train.py --train\n```\n4. Obtain the classifier training imagery and move these images into `data/SPOT_train/classifier_ground/` and sort them into `train` and `val`, and `neg` and `pos` within each.\n5. Train the SPOT classifier model:\n```bash\npython solarpv/training/spot/train_classifier.py --train\n```\n\n### Deployment\n\n1. Deployment makes use of the DescartesLabs platform. DescartesLabs _alpha_ access is required to successfully deploy the machine learning model. Contact the authors for details.\n2. Generate the YAML which tracks the cloud functions and cloud products used in the pipeline deployment.\n```bash\npython bin/genyaml_cloudfunctions.py\npython bin/genyaml_cloudproducts.py\n```\n3. Generate the cloud product and the cloud functions.\n```bash\npython solarpv/deployment/create_cloud_products.py\npython solarpv/deployment/create_cloud_functions.py\n```\n4. Store the machine learning models:\n```bash\npython solarpv/deployment/store_model.py --model_path=""./s2_rnn1.h5""\npython solarpv/deployment/store_model.py --model_path=""./s2_rnn2.h5""\npython solarpv/deployment/store_model.py --model_path=""./s2_unet.h5""\npython solarpv/deployment/store_model.py --model_path=""./solar_pv_airbus_spot_rgbn_v5.h5""\n```\n5. Deploy pipeline stages to selected geographies. Pipeline stages must be one of `[S2Infer1, S2RNN1, S2Infer2, SPOTVectoriser]`; geographies must be an ISO 3166-1 alpha-2 (two-letter) code. 
For example, to deploy the primary inference stage for the United Kingdom:\n```bash\npython solarpv/deployment/pipeline.py --model_path=""S2Infer1"" --geography=""GB""\n```\n\n### Analysis\n\n#### Quality\n\n1. Run the cross-validation precision-recall notebook `solarpv/analysis/quality/precision-recall.ipynb`\n2. Run the deployment precision notebook `solarpv/analysis/quality/deploy_precision.ipynb`\n3. Run the S2 band dropout script `solarpv/analysis/quality/S2_band_dropout.py`\n4. Run the SPOT band dropout notebook `solarpv/analysis/quality/SPOT_band_dropout.ipynb`\n\n#### Generating Capacity\n\n1. Run the generation capacity notebook `solarpv/analysis/generating_capacity/MW_capacity.ipynb`\n\n#### Landcover\n\n1. Run the cloud land cover notebook `solarpv/analysis/landcover/cloud_land_cover.ipynb`\n2. Run the cloud land cover skew notebook `solarpv/analysis/landcover/land_cover_skew_deploy.ipynb`\n3. Run the land cover skew analysis notebook `solarpv/analysis/landcover/land_cover_skew_analysis.ipynb`\n\n#### Matching\n\n1. Match geographies of interest with the WRI Global Power Plant Database and the EIA power plant dataset using `solarpv/analysis/matching/match_region.py`:\n```bash\npython solarpv/analysis/matching/match_region.py --dataset=""wri"" --geography=""GB""\n```\n2. Postprocess the matches with `solarpv/analysis/matching/match_postprocess.ipynb`\n'",",https://arxiv.org/abs/1505.04597,https://zenodo.org/record/5005868,https://zenodo.org/record/5005868","2019/11/24, 22:39:07",1430,MIT,0,111,"2022/07/13, 12:53:29",1,7,11,0,469,0,0.0,0.03157894736842104,"2021/06/30, 08:43:52",v1.0.0,0,2,false,,false,true,,,,,,,,,,,
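The matching stage above pairs PV detections with asset-level records (WRI, EIA) by geodesic distance. As a simplified, hypothetical sketch of that idea — the repository itself uses Vincenty distances (`vincenty.py`) and a MILP formulation in `match_region.py`; the function names, coordinates, and 5 km threshold below are illustrative only — a greedy nearest-neighbour pairing with a haversine distance could look like:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km (a simpler stand-in for Vincenty)."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def greedy_match(detections, plants, max_km=5.0):
    """Pair each detection (lat, lon) with the closest unused plant within max_km."""
    pairs, used = [], set()
    for i, (dlat, dlon) in enumerate(detections):
        best, best_d = None, max_km
        for j, (plat, plon) in enumerate(plants):
            if j in used:
                continue
            d = haversine_km(dlat, dlon, plat, plon)
            if d <= best_d:
                best, best_d = j, d
        if best is not None:
            used.add(best)
            pairs.append((i, best))
    return pairs
```

A MILP, as used by `match_region.py`, improves on this greedy sketch by optimising all pairings jointly instead of committing to each nearest neighbour in turn.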
dGen,"Forecast PV adoption based on user specified configurations like electricity rate prices, electricity load growth, solar resource factors, and much more.",NREL,https://github.com/NREL/dgen.git,github,,Photovoltaics and Solar Energy,"2023/10/03, 17:09:39",52,0,13,true,Python,National Renewable Energy Laboratory,NREL,"Python,Jupyter Notebook",,"b'\xef\xbb\xbf\n
\n
\n\nThe Distributed Generation Market Demand (dGen) Model\n=====================================================\n\n\n \n
\n \n \n
\n \n
\n\n\n## Documentation\n- [Webinar and Setup Tutorial](https://youtu.be/-Te5_KKZR8o)\n- [Official dGen Documentation](https://nrel.github.io/dgen/) \n- [Wiki](https://github.com/NREL/dgen/wiki)\n\nNote, after September 30th 2021 the model will be updated to version 2.0.0 and use parquet, rather than pickle (.pkl) formatted agent files. The agent data will be unchanged and the new parquet agent files can be found in [OEDI](https://data.openei.org/submissions/1931). If you wish to continue using version 1.0.0 with the pickle formatted agent files then you can find these agent files [here](https://data.nrel.gov/submissions/169).\n\n## Get Your Tools\nInstall Docker for [(Mac)](https://docs.docker.com/docker-for-mac/install/) or [(Windows)](https://docs.docker.com/docker-for-windows/install/)\n\n- Important: In Docker, go into Docker > Preferences > Resources and up the allocation for disk size image for Docker. 16 GB is recommended for smaller (state level) databasese. 32 GB is recommended for ISO specific databases. 70+GB is required for restoring the national level database. If you get a memory issue then you\'ll need to up the memory allocation and or will need to prune past failed images/volumes. Running the below docker commands will clear these out and let you start fresh:\n```\n $ docker system prune -a \n $ docker volume prune -f\n``` \n- Please refer to Docker\xe2\x80\x99s [documentation](https://docs.docker.com/reference/) for more details.\n\n- Install [Anaconda for Python 3.7](https://www.anaconda.com/distribution/). Users with VPNs may need to turn their VPNs off while installing or updating Anaconda.\n\n- Install [PgAdmin](https://www.pgadmin.org/download/). 
Ignore all of the options for docker, python, os host, etc.\n\n- Install Git: If you don\'t already have git installed, then navigate [here](https://www.atlassian.com/git/tutorials/install-git) to install it for your operating system.\n\nWindows users: \n- We recommend using Powershell.\n- If you don\'t have UNIX commands enabled for command prompt/powershell then you\'ll need to install Cygwin or QEMU to run a UNIX terminal.\n\n## Download Code \nUsers need to fork a copy of the dGen repo to their own private GitHub account. \n\nNext, clone the forked repository to your local machine by running the following in a terminal/powershell/command prompt (replace `<your_github_username>` with your GitHub username):\n```$ git clone https://github.com/<your_github_username>/dgen.git```\n\n\n# Running and Configuring dGen\n\n### A. Create Environment\nAfter cloning this repository and installing (and running) Docker as well as Anaconda, we\'ll create our environment and container:\n\n1. Depending on the directory you cloned this repo into, navigate in terminal to the python directory (/../dgen/python) and run the following command:\n\n```$ conda env create -f dg3n.yml```\n\n- This will create the conda environment needed to run the dgen model.\n- The dgen model is optimized for Python v3 and above. Run ```$ conda list ``` to verify you have this version.\n\n2. 
This command will create a container with PostgreSQL initialized.\n\n```$ docker run --name postgis_1 -p 5432:5432 -e POSTGRES_USER=postgres -e POSTGRES_PASSWORD=postgres -d mdillon/postgis```\n\n- Alternatively, if having issues connecting to the postgres server in pgAdmin, run:\n\n```$ docker run --name postgis_1 -p 5432 -e POSTGRES_USER=postgres -e POSTGRES_PASSWORD=postgres -d mdillon/postgis```\n\n- This will allow the docker container to select a different port to forward to 5432.\n\nTo set up another docker container with a different database you can run:\n```$ docker run --name postgis_2 -p 7000:5432 -e POSTGRES_USER=postgres -e POSTGRES_PASSWORD=postgres -d mdillon/postgis``` where ```7000``` can be any port number not already in use. \n\n\n3. Connect to our postgresql DB. In the command line run the following (replace `<container_id>` with your container\'s alpha-numeric ID):\n\n```\n $ docker exec -it <container_id> psql -U postgres\n $ postgres=# CREATE DATABASE dgen_db;\n```\n\nNotes:\n- Use the alpha-numeric container id rather than the container name.\n- The container id can be obtained by running ```$ docker container ls```. If this doesn\'t display anything try running ```$ docker container ps```.\n- If you get the error ``` psql: FATAL: the database system is starting up ``` try rerunning the docker exec command again after a minute or so because docker can take some time to initialize everything.\n- ```CREATE DATABASE``` will be printed when the database is created. ```\\l``` will display the databases in your server.\n\n\n### B. Download data (agents and database):\nDownload data by navigating to https://data.openei.org/submissions/1931 and clicking the \'model inputs\' tab. Make sure to unzip any zipped files once downloaded. Note, the 13.5 GB dgen_db.sql.zip file contains all of the data for national level runs. We recommend starting with the database specific to the state or ISO region you\'re interested in. 
\n\nFor example, if you want to simulate only California then navigate to the \'ca_final_db\' folder and download the dgen_db.sql file. \n\nYou will also need to download and unzip the agent files ""OS_dGen_Agents.zip"", making sure to use the correct agent file corresponding to the scenario you\'d like to run (e.g. commercial agents for California).\n\n#### Windows Users\n\nWe recommend using Powershell. If you don\'t have UNIX commands enabled for command prompt/powershell then you\'ll need to install Cygwin or QEMU to run a UNIX terminal.\n\nIn Powershell run the following (replace \'path_to_where_you_saved_data\' below with the actual path where you saved your database file, and `<container_id>` with your container\'s ID):\n\n```\n $ docker cp /path_to_where_you_saved_data/dgen_db.sql <container_id>:/dgen_db.sql\n $ docker exec -i <container_id> psql -U postgres -d dgen_db -f dgen_db.sql\n```\n\n\n#### Mac Users\n\nIn a new terminal window run the following (make sure to replace \'path_to_where_you_saved_data\' below with the actual path where you saved your database file, and `<container_id>` with your container\'s ID): \n\n```$ cat /path_to_where_you_saved_data/dgen_db.sql | docker exec -i <container_id> psql -U postgres -d dgen_db```\n\nNotes:\n- Restoring state/ISO databases will likely take 5-15 minutes. The national database will likely take 45-60 minutes.\n- Don\'t close docker at any point while running dGen.\n- The container can be ""paused"" by running ```$ docker stop <container_id>``` and ""started"" by running ```$ docker start <container_id>```\n- The container must be started/running to restore and/or access the database (including during model run time).\n\nTroubleshooting Container/Database Issues:\n- Make sure the disk size for Docker has been properly allocated (at least 16 GB for state level databases, at least 32 GB for ISO level databases, and at least 70 GB for the national database). 
You\'ll need to restart docker after changing the disk size in Docker\'s system preferences and will need to make a new container/start from scratch.\n- If making a new container first run ```docker system prune -a``` and ```docker volume prune -f```.\n- Make sure you\'ve specified the right path to the .sql file and make sure the .sql file is unzipped.\n- Make sure to use the container\'s alpha-numeric ID rather than the container name. \n- If on a VPN try turning the VPN off when making the container and restoring the database.\n- Try googling errors.\n\n### C. Create Local Server:\nOnce the database is restored (it will take some time), open PgAdmin and create a new server. Name this whatever you want. Input ""localhost"" (or 127.0.0.1) in the host/address cell and ""postgres"" in both the username and password cells. Upon refreshing this and opening the database dropdown, you should be able to see your database. \n\n### D: Activate Environment \nActivate the dg3n environment and launch spyder by opening a new terminal window and running the following commands:\n\n```\n $ conda activate dg3n\n $ (dg3n) spyder\n```\n\n- In spyder, open the ```dgen_model.py``` file. This is what we will run once everything is configured.\n\nNotes:\n- Sometimes Spyder can have issues accessing files. It may be helpful to set the working directory by right clicking the white folder icon in the upper righthand corner and navigating to ```/path_to_where_you_cloned_dgen/dgen_os```.\n- Spyder\'s kernel can sometimes have issues/stop unexpectedly. Refreshing the kernel might help if you\'re encountering issues running dgen_model.py.\n- Spyder isn\'t necessary to use. If you\'d rather run dGen by launching python from the dg3n environment then by all means do so.\n\n### E: Configure Scenario\n1. Open the blank input sheet located in ```dgen_os/excel/input_sheet_v_beta.xlsm ``` (don\'t forget to enable macros!). This file defines most of the settings for a scenario. 
Configure it depending on the desired model run and save a copy in the input_scenarios folder, i.e. ```dgen_os/input_scenarios/my_scenario.xlsm```. \n\nSee the Input Sheet [Wiki page](https://github.com/NREL/dgen/wiki) for more details on customizing scenarios. \n\n\n2. In the python folder, open ```pg_params_connect.json``` and configure it to your local database. If you didn\'t change your username or password settings while setting up the docker container, this file should look like the below example:\n\n```\n {\t\n\t""dbname"": """",\n \t""host"": ""localhost"",\n\t""port"": ""5432"",\n\t""user"": ""postgres"",\n\t""password"": ""postgres""\n }\n```\n\n- dbname will likely just be ""dgen_db"" unless you changed the name of this database in postgres\n- Localhost could also be set as ""127.0.0.1""\n- Save this file\n- Make sure the role is set as ""postgres"" in ```settings.py``` (it is set as ""postgres"" already by default)\n\nThe cloned repository will have already initialized the default values for the following important parameters:\n\n* ``` start_year = 2014 ``` ( in /../dgen/python/config.py) --> start year the model will begin at\n* ``` pg_procs = 2 ``` ( in /../dgen/python/config.py) --> number of parallel processes the model will run with\n* ``` cores = 2 ``` ( in /../dgen/python/config.py) --> number of cores the model will run with\n* ``` role = ""postgres"" ``` ( in /../dgen/python/config.py) --> set role of the restored database\n\n\n### F: Run the Model\n\nOpen ```dgen_model.py``` in the Spyder IDE and hit the large green arrow ""play button"" near the upper left to run the model.\n\nOr, launch python from within the dg3n environment and run:\n```$ python dgen_model.py```\n\nNotes:\n- Only one agent file can be put in the input_agents directory.\n- Results from the model run will be placed in a SQL table called ""agent_outputs"" within a newly created schema in the connected database. 
\n- The database and results will be preserved in the docker container if you stop the container and/or close Docker. Simply start the container to access the database again.\n- The database will not persist once a docker container is terminated. Results will need to be saved locally by downloading the agent_outputs table from the schema run of interest or by dumping the entire database to a .sql file (see below).\n\n## Saving Results:\n1. To back up the whole database, including the results from the completed run, please run the following command in terminal after changing the save path and database name (replace `<container_id>` with your container\'s ID):\n\n```$ docker exec <container_id> pg_dumpall -U postgres > /../path_to_save_directory/dgen_db.sql```\n\n- This .sql file can be restored in the same way as was detailed above. \n\n2. To export just the ""agent_outputs"" table, simply right click on this table and select the ""Import/Export"" option and configure how you want the data to be saved. Note, if a save directory isn\'t specified this will likely save in the home directory.\n\n\n## Notes:\n- The ""load_path"" variable in config.py from the beta release has been removed for the final release. The load data is now integrated into each database. Load data and meta data for the agents is still accessible via the OEDI data submission.\n'",,"2020/04/15, 15:51:02",1288,BSD-3-Clause,7,88,"2023/06/23, 18:52:15",8,7,24,5,124,2,0.0,0.5909090909090908,"2021/03/26, 20:34:39",1.0.0,0,5,false,,false,false,,,https://github.com/NREL,http://www.nrel.gov,"Golden, CO",,,https://avatars.githubusercontent.com/u/1906800?v=4,,,
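Step E.2 above fills in `pg_params_connect.json` with the database connection parameters. As a hedged sketch of how such a file maps onto a Postgres connection (the `to_dsn` helper is hypothetical, and the `"dgen_db"` value assumes the database created in step A.3 — dGen's own loading code may differ):

```python
import json

# Hypothetical contents mirroring python/pg_params_connect.json; the values are
# the defaults from the setup steps above.
params = json.loads("""
{
  "dbname": "dgen_db",
  "host": "localhost",
  "port": "5432",
  "user": "postgres",
  "password": "postgres"
}
""")

def to_dsn(p):
    """Build a libpq-style keyword/value connection string,
    e.g. for psycopg2.connect(dsn)."""
    return " ".join(f"{k}={v}" for k, v in p.items())

print(to_dsn(params))
```

With the docker container started, the resulting DSN could be passed to `psycopg2.connect()` to reach the restored database.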
SOLECTRUS,An alternative photovoltaic dashboard that visualizes the yield and consumption.,solectrus,https://github.com/solectrus/solectrus.git,github,"senec,photovoltaic,photovoltaics,photovoltaics-dashboard,influxdb",Photovoltaics and Solar Energy,"2023/10/25, 17:36:43",72,0,40,true,Ruby,SOLECTRUS,solectrus,"Ruby,Slim,TypeScript,JavaScript,Shell,CSS,Dockerfile,Procfile",https://solectrus.de,"b'[](https://github.com/solectrus/solectrus/actions)\n[](https://codeclimate.com/repos/5fe98897e985f4018b001e7d/maintainability)\n[](https://codeclimate.com/repos/5fe98897e985f4018b001e7d/test_coverage)\n[](https://wakatime.com/badge/user/697af4f5-617a-446d-ba58-407e7f3e0243/project/ce8d6e54-7457-42e5-94a3-33a9d4021d45)\n\n# SOLECTRUS\n\nPhotovoltaic Dashboard, read here about the motivation (in German):\nhttps://ledermann.dev/blog/2021/02/03/photovoltaik-dashboard-als-web-applikation/\n\n\n\n## Installation\n\nFor self-hosting SOLECTRUS, please look at https://github.com/solectrus/hosting\n\n## Development\n\n1. Clone the repo locally:\n\n```bash\ngit clone git@github.com:solectrus/solectrus.git\ncd solectrus\n```\n\n2. Install PostgreSQL, Redis, and puma-dev (if not already present). On a Mac with HomeBrew, run this to install from the `Brewfile`:\n\n```bash\nbrew bundle\n```\n\n3. Install and set up [puma-dev](https://github.com/puma/puma-dev) to use HTTPS for development. Do this on macOS:\n\n```bash\nsudo puma-dev -setup\npuma-dev -install\npuma-dev link\n\n# Use Vite via puma-dev proxy\n# Adopted from https://github.com/puma/puma-dev#webpack-dev-server\necho 3036 > ~/.puma-dev/vite.solectrus\n```\n\n4. Setup the application to install gems and NPM packages and create the database:\n\n```bash\nbin/setup\n```\n\n5. 
Start the application locally:\n\n```bash\nbin/dev\n```\n\nThen open https://solectrus.test in your browser.\n\n## Test\n\nAfter preparing the development environment (see above):\n\n```bash\nbin/influxdb-restart.sh\nDISABLE_SPRING=1 bin/rspec\nDISABLE_SPRING=1 RAILS_ENV=test bin/rake cypress:run\nopen coverage/index.html\n```\n\nRuboCop:\n\n```\nbin/rubocop\n```\n\nESLint:\n\n```\nbin/yarn lint\n```\n\nTypeScript:\n\n```\nbin/yarn tsc\n```\n\nThere is a shortcut to run **all** tests and linting tools:\n\n```bash\nbin/test\n```\n\n## License\n\nCopyright (c) 2020-2023 Georg Ledermann, released under the AGPL-3.0 License\n'",,"2020/12/19, 18:28:37",1040,AGPL-3.0,1606,3682,"2023/10/25, 17:36:44",41,2278,2380,979,0,0,0.0,0.5207756232686981,"2023/10/01, 13:36:19",v0.13.2,0,4,false,,false,false,,,https://github.com/solectrus,https://solectrus.de,"Jülich, Germany",,,https://avatars.githubusercontent.com/u/76243773?v=4,,,
pvdeg,Set of tools to calculate degradation responses and degradation related parameters for PV.,NREL,https://github.com/NREL/PVDegradationTools.git,github,"degradation,photovoltaic-systems,python,reliability,duramat,pv-modules",Photovoltaics and Solar Energy,"2023/09/14, 18:01:59",14,0,10,true,Jupyter Notebook,National Renewable Energy Laboratory,NREL,"Jupyter Notebook,Python,Dockerfile,TeX",https://pvdegradationtools.readthedocs.io/,"b'
\n\n\n\n | License | \n \n \n \n \n | \n
\n\n | Publications | \n \n \n | \n
\n\n | Documentation | \n \n\t\n\t \n\t\n | \n
\n
\n\n\n\n# PV Degradation Tools (pvdeg)\n\nThis repository contains functions for calculating degradation of photovoltaic modules. For example, functions to calculate front and rear relative humidity, as well as acceleration factors. A degradation calculation function is also being developed, considering humidity and spectral irradiance models.\n\n\nTutorials\n=========\n\n### Jupyter Book\n\nFor in depth Tutorials you can run online, see our [jupyter-book](https://nrel.github.io/PVDegradationTools/intro.html) [](https://nrel.github.io/PVDegradationTools/intro.html)\n\nClicking on the rocket-icon on the top allows you to launch the notebooks on [Google Colaboratory](https://colab.research.google.com/) for interactive mode.\nJust uncomment the first line `pip install ...` to install the environment on each notebook if you follow this mode.\n\n### Binder\n\nTo run these tutorials in Binder, you can click here:\n[](https://mybinder.org/v2/gh/NREL/PVDegradationTools/main)\nIt takes a minute to load the environment.\n\n### Locally\n\nYou can also run the tutorial locally with\n[miniconda](https://docs.conda.io/en/latest/miniconda.html) by following these\nsteps:\n\n1. Install [miniconda](https://docs.conda.io/en/latest/miniconda.html).\n\n1. Clone the repository:\n\n ```\n git clone https://github.com/NREL/PVDegradationTools.git\n ```\n\n1. Create the environment and install the requirements. The repository includes\n a `requirements.txt` file that contains a list of the packages needed to run\n this tutorial. To install them using conda run:\n\n ```\n conda create -n pvdeg jupyter -c pvlib --file requirements.txt\n conda activate pvdeg\n ```\n\n or you can install pvdeg into the environment with `pip install pvdeg` as explained in the installation instructions.\n\n1. Start a Jupyter session:\n\n ```\n jupyter notebook\n ```\n\n1. 
Use the file explorer in Jupyter lab to browse to `tutorials`\n and start the first Tutorial.\n\n\nDocumentation\n=============\n\nDocumentation is available in [ReadTheDocs](https://PVDegradationTools.readthedocs.io) where you can find more details on the API functions.\n\n\nInstallation\n============\n\nRelative Humidity and Acceleration Factors for Solar Modules releases may be installed using the ``pip`` and ``conda`` tools. Compatible with Python 3.5 and above.\n\nInstall with:\n\n pip install pvdeg\n\nFor developer installation, download the repository, navigate to the folder location and install as:\n\n pip install -e .\n\n\nLicense\n=======\n\n[BSD 3-clause](https://github.com/NREL/PVDegradationTools/blob/main/LICENSE.md)\n\n\nContributing\n=======\n\nWe welcome contributions to this software, but please read the copyright license agreement (cla-1.0.md), with instructions on signing it in sign-CLA.md. For questions, email us.\n\n\nGetting support\n===============\n\nIf you suspect that you may have discovered a bug or if you\'d like to\nchange something about pvdeg, then please open an issue on our\n[GitHub issues page](https://github.com/NREL/PVDegradationTools/issues).\n\n\nCiting\n======\n\nIf you use these functions in a published work, please cite:\n\n\tHolsapple, Derek, Ayala Pelaez, Silvana, Kempe, Michael. ""PV Degradation Tools"", NREL Github 2020, Software Record SWR-20-71.\n\nAnd/or the specific release from Zenodo:\n\n\tOvaitt, Silvana, Brown, Matt, Springer, Martin, Karas, Joe, Holsapple, Derek, Kempe, Michael. (2023). NREL/PVDegradationTools: v0.1.0 official release (0.1.0). Zenodo. 
https://doi.org/10.5281/zenodo.8088403\n'",",https://doi.org/10.5281/zenodo.8088578,https://doi.org/10.5281/zenodo.8088403\n","2020/06/03, 20:26:52",1238,CUSTOM,436,466,"2023/10/20, 21:34:18",2,25,25,24,4,0,0.1,0.650137741046832,"2023/09/13, 18:11:05",0.1.3,0,8,false,,false,false,,,https://github.com/NREL,http://www.nrel.gov,"Golden, CO",,,https://avatars.githubusercontent.com/u/1906800?v=4,,,
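The acceleration factors pvdeg computes for module degradation are commonly Arrhenius-based. As a rough illustration of the underlying relation only — this is not pvdeg's API; the function name, activation energy, and temperatures below are hypothetical, and the library's actual signatures are in its documentation — an acceleration factor between a field temperature and a chamber test temperature can be computed as:

```python
import math

BOLTZMANN_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def arrhenius_af(ea_ev, t_use_c, t_test_c):
    """Arrhenius acceleration factor AF = exp[(Ea/k) * (1/T_use - 1/T_test)],
    with temperatures converted from Celsius to Kelvin."""
    t_use = t_use_c + 273.15
    t_test = t_test_c + 273.15
    return math.exp((ea_ev / BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_test))

# e.g. assumed Ea = 0.6 eV, 45 C module temperature in the field
# versus an 85 C damp-heat chamber test
print(round(arrhenius_af(0.6, 45.0, 85.0), 2))
```

The factor grows exponentially with the gap between use and test temperatures, which is why chamber tests at 85 C can compress years of field exposure into weeks.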
solarthing,An application that can monitor data from a variety of solar charge controllers and inverters.,wildmountainfarms,https://github.com/wildmountainfarms/solarthing.git,github,"solar,outback-mate,renogy,renogy-rover,couchdb,raspberry-pi,solarthing,solar-energy,dashboard,modbus,pvoutput,packets,energy-monitor,crne-solar,slack,slack-bot,solcast",Photovoltaics and Solar Energy,"2023/10/23, 18:36:35",106,0,26,true,Java,Wild Mountain Farms,wildmountainfarms,"Java,TypeScript,Shell,Python,CSS,Perl,HTML,ANTLR,JavaScript,Dockerfile,C++",https://solarthing.readthedocs.io,"b'\n\n[](https://github.com/wildmountainfarms/solarthing/commits/master)\n[](https://github.com/wildmountainfarms/solarthing/stargazers)\n[](https://github.com/wildmountainfarms/solarthing/releases)\n[](https://github.com/wildmountainfarms/solarthing/releases)\n[](https://solarthing.readthedocs.io/en/latest/installation.html)\n\nStores solar data in a database to view on Android, Grafana, or PVOutput\n\n\n Supported Products •\n Documentation •\n Features •\n Supported Databases •\n Examples\n
\n\n## Supported Products\n* **Outback MATEs** (FX Inverter, MX/FM Charge Controller)\n* **Renogy Rover** (And other Renogy products) over modbus serial.\n * Includes Rover, Rover Elite, Wanderer, Adventurer, Dual Input DCDC Charger, Rover Boost and possibly others\n * Compatible with all [SRNE Solar](https://www.srnesolar.com) Charge Controllers (And rebranded products)\n * Compatible with **Zenith Grape** Solar Charge Controller, **PowMr** MPPT Charge Controller, **RICH** SOLAR MPPT, **WindyNations TrakMax** MPPT\n* **EPEver Tracer**\n * Includes the AN series and the TRIRON N series\n * Possibly includes the BN series (untested)\n* DS18B20 Temperature Sensors and PZEM-003 and PZEM-017 Shunts\n\n# Quickstart\nReady to install? Use the [Quickstart](https://solarthing.readthedocs.io/en/latest/installation.html)!\n\n\n# Features\n* Supports **multiple types of solar products**.\n* Runs reliably **24-7**. Recovers from connection errors and has verbose logging features.\n* Fully customizable through JSON (**No programming experience required**).\n* Supports CouchDB, InfluxDB, local JSON file, and PVOutput exporting.\n * Multiple databases can even be used at the same time!\n * Packets are uploaded in parallel to multiple databases at the same time\n* Can [report Raspberry Pi CPU temperature](https://solarthing.readthedocs.io/en/latest/config/rpi-cpu-temp.html).\n* Easy setup on Linux. 
Runs *without* root.\n\n## Supported Databases\n* CouchDB\n * Allows for [SolarThing Android](https://github.com/wildmountainfarms/solarthing-android) and [SolarThing Web](https://github.com/wildmountainfarms/solarthing-web) to function\n * Used for PVOutput data collection\n* GraphQL\n * Allows use of CouchDB SolarThing data with Grafana\n * Supplements the CouchDB database\n* InfluxDB\n * Simplest to set up with Grafana\n* [PVOutput.org](https://pvoutput.org)\n * Allows for viewing of data on [pvoutput.org](https://pvoutput.org)\n * Requires CouchDB to be set up\n * Enables usage of the [PVOutput Mycroft skill](https://github.com/wildmountainfarms/pvoutput-mycroft)\n* REST API\n * With the ""post"" database, all packets can be posted to a URL endpoint, useful for REST APIs\n\n\n### Examples\nPVOutput Wild Mountain Farms: [PVOutput System](https://pvoutput.org/intraday.jsp?sid=72206) and \n[PVOutput SolarThing Teams](https://pvoutput.org/listteam.jsp?tid=1528)\n\n---\n\nSolarThing Android: [Github](https://github.com/wildmountainfarms/solarthing-android)\n|\n[Google Play](https://play.google.com/store/apps/details?id=me.retrodaredevil.solarthing.android)\n\nSolarThing Android displays data in a persistent notification that updates at a configurable rate\n\n
\n\nYou can get data in [Grafana](https://github.com/grafana/grafana) via InfluxDB or via CouchDB+SolarThing GraphQL.\n\n[Snapshot of Wild Mountain Farms Dashboard](https://snapshot.raintank.io/dashboard/snapshot/iPsTvb6a0eOxEtvvu58dvRuJsJ38Onnp?orgId=2)\n\nGrafana is very customizable. Rearrange graphs and make it how you want!\n\n\n---\n\n## Usage at Wild Mountain Farms\nWe monitor an Outback MATE2, Renogy Rover PG 40A, EPEver Tracer2210AN (20A) using a Raspberry Pi 3.\nEach device has its own instance of SolarThing running. Each instance uploads data to CouchDB. CouchDB, Grafana,\nand SolarThing GraphQL run on a separate ""NAS"" computer. This NAS runs the automation and pvoutput programs.\nThe automation program handles the sending of Slack messages for low battery notifications.\n\n### Database Setup\n* [CouchDB setup](https://solarthing.readthedocs.io/en/latest/config/couchdb.html)
\n * Used for SolarThing Android, SolarThing Web, and SolarThing GraphQL (which gets data to Grafana)\n* [InfluxDB 2.0 setup](https://solarthing.readthedocs.io/en/latest/config/influxdb2.html)
\n * Used for direct Grafana queries\n\n#### [Developer Use](other/docs/developer_use.md)\n#### [Contributing](CONTRIBUTING.md)\n#### [Technical](other/docs/technical/technical.md)\n#### [Project Structure](other/docs/technical/project_structure.md)\n#### [History](other/docs/history.md)\n#### [Google Analytics](https://solarthing.readthedocs.io/en/latest/config/analytics.html)\n#### [Updating](https://solarthing.readthedocs.io/en/latest/updating.html)\n\n#### Configuration\nThis uses all JSON for configuring everything. The files you edit are all in one place unless you decide to move them.\n\nSee [configuration](https://solarthing.readthedocs.io/en/latest/configuration.html) to see how to set them up\n\n### Renogy Rover Monitoring Alternatives\nDon\'t like something about SolarThing? Here are some alternatives to monitor your Renogy Rover.\n* https://github.com/corbinbs/solarshed\n* https://github.com/logreposit/renogy-rover-reader-service\n* https://github.com/menloparkinnovation/renogy-rover\n* https://github.com/floreno/renogy-rover-modbus\n* https://github.com/CyberRad/CoopSolar\n* https://github.com/amigadad/SolarDataCollection\n\n### Suggestions?\nIf you have suggestions on how to improve the documentation or have a feature request, I\'d love to\nhear from you! [SolarThing Issues](https://github.com/wildmountainfarms/solarthing/issues)\n\nIf you get confused while trying to configure solarthing, that\'s probably because the documentation is\nalways a work in progress. If you find something confusing, please report it, so I can make it clearer.\n\n---\n\n[](https://github.com/retrodaredevil)\n[](#)\n[](#)\n[](https://libraries.io/github/wildmountainfarms/solarthing)\n[](#)\n'",,"2019/07/03, 09:49:26",1575,MIT,227,1145,"2023/10/23, 18:36:44",7,99,168,98,2,0,0.0,0.052631578947368474,"2023/07/02, 18:19:55",v2023.4.0,0,3,true,github,false,true,,,https://github.com/wildmountainfarms,,,,,https://avatars.githubusercontent.com/u/52333871?v=4,,,
solXpect,Android app to forecast the output of your photovoltaic system (PV) or balcony pv using data from Open-Meteo.com.,woheller69,https://github.com/woheller69/solxpect.git,github,"android,balkonpv,forecasting,powerplant,solar-energy,photovoltaic,photovoltaics,renewable-energy,sustainability",Photovoltaics and Solar Energy,"2023/10/12, 08:39:32",47,0,47,true,Java,,,"Java,HTML,CSS",,"b'Send a coffee to woheller69@t-online.de \n![]()
\n\n\n| **RadarWeather** | **Gas Prices** | **Smart Eggtimer** |\n|:---:|:---:|:---:|\n| [
](https://f-droid.org/packages/org.woheller69.weather/)| [
](https://f-droid.org/packages/org.woheller69.spritpreise/) | [
](https://f-droid.org/packages/org.woheller69.eggtimer/) |\n| **Bubble** | **hEARtest** | **GPS Cockpit** |\n| [
](https://f-droid.org/packages/org.woheller69.level/) | [
](https://f-droid.org/packages/org.woheller69.audiometry/) | [
](https://f-droid.org/packages/org.woheller69.gpscockpit/) |\n| **Audio Analyzer** | **LavSeeker** | **TimeLapseCam** |\n| [
](https://f-droid.org/packages/org.woheller69.audio_analyzer_for_android/) |[
](https://f-droid.org/packages/org.woheller69.lavatories/) | [
](https://f-droid.org/packages/org.woheller69.TimeLapseCam/) |\n| **Arity** | **omWeather** | **solXpect** |\n| [
](https://f-droid.org/packages/org.woheller69.arity/) | [
](https://f-droid.org/packages/org.woheller69.omweather/) | [
](https://f-droid.org/packages/org.woheller69.solxpect/) |\n| **gptAssist** | | |\n| [
](https://f-droid.org/packages/org.woheller69.gptassist/) | | |\n\n# solXpect\n\nsolXpect is an app that forecasts the output of your solar power plant by using direct and diffuse radiation data from Open-Meteo.com, calculating the position of the sun, and projecting the radiation on your solar panel. \nIt shows the estimated energy production for the next 16 days, with hourly values calculated for the preceding hour. As an example, if there are 150 Wh shown at 11:00 this means you can expect 150 Wh between 10:00 and 11:00 from your photovoltaic system.\nThe values starting with \xce\xa3 show the cumulated energy since midnight of the first day.\nTo use solXpect, you simply enter your latitude and longitude coordinates, as well as the azimuth and tilt of your solar panel. \nYou also enter information about the peak power, efficiency, temperature coefficient, and area of your solar panel, as well as the maximum power and efficiency of your inverter.\nAdditionally, solXpect allows you to define shading on your solar panels by specifying the minimum elevation of the sun necessary for the sun to hit the solar panels, as well as the percentage of shading for elevations below this value.\nIf you have multiple solar panels with the same latitude and longitude but pointing in different directions, you can define them as separate locations and use the \'show sum\' feature to summarize their output.\nOverall, solXpect is a powerful tool for optimizing the use of your own energy and reducing your energy costs. \n\n
\n\n[
](https://f-droid.org/de/packages/org.woheller69.solxpect/)\n\n## Parameters\n\n#### Name\nDefine a name for the location.\nIf you have several modules pointing in different directions at the same location you can activate ""showSum"" mode in settings.\nIn this case you should define your location names as \'myPV | part1\', \'myPV | part2\', etc. In \'show sum\' mode the location is then shown as \'myPV\' and \'|\' is taken as delimiter.\n\n#### Latitude [\xc2\xb0] \nEnter the north-south position of your solar power plant, ranging from -90\xc2\xb0 at the south pole to 90\xc2\xb0 at the north pole.\n\n#### Longitude [\xc2\xb0]\nEnter the east-west position of your solar power plant, with 0\xc2\xb0 defined as the prime meridian. Positive longitudes are east of the prime meridian, negative ones are west.\n\n#### Azimuth [\xc2\xb0]\nSpecify the horizontal direction of your solar power plant, with 0\xc2\xb0 corresponding to North, 90\xc2\xb0 to East, 180\xc2\xb0 to South, and 270\xc2\xb0 to West.\n\n#### Tilt [\xc2\xb0]\nSpecify the vertical direction of your solar power plant, with 0\xc2\xb0 pointing upwards towards the sky and 90\xc2\xb0 being a vertical orientation pointing towards the horizon.\n\n#### Cells peak power [W]\nEnter the maximum power your solar cells (total of all cells) can deliver. At the moment this value is only used if a value of 0 is specified for cells efficiency or cell area.\nIn this case it is assumed that the cells peak power is given at an irradiance of 1000W/sqm.\n\n#### Cells efficiency [%]\nSpecify the portion of energy in the form of sunlight that can be converted into electricity by the solar cell.\n\n#### Temperature coefficient [%/K]\nEnter the dependence of the cell power on temperature, usually in the range of -0.4%/K. 
Cell temperature is estimated from ambient temperature and total irradiance.\n\n#### Cell area [m2]\nEnter the size of your solar panels (total of all cells).\n\n#### Diffuse radiation efficiency [%]\nSpecify the efficiency of your solar power plant for diffuse radiation. When pointing up, it should be around 100%, but when pointing towards the horizon, it may be 50% or less, depending on the environment.\nYou probably need to optimize this parameter.\n\n#### Albedo [0..1]\nSpecify the average albedo for your environment to take reflections into account. The value ranges from 0 (all radiation is absorbed) to 1 (all radiation is reflected).\nExamples: Fresh snow: 0.8, green grass: 0.25, asphalt: 0.1\nYou probably need to optimize this parameter.\n\n#### Inverter power [W]\nSpecify the maximum power of your inverter. If it is lower than the maximum power of your panels, the output power of your system will be limited by this parameter.\n\n#### Inverter efficiency [%] \nEnter the efficiency of your inverter.\n\n#### Shading\nIn this section you can define the shading on your solar panels.\nFor each azimuth angle range, you can specify the minimum elevation of the sun that is necessary for the sun to hit the solar panels.\nFor elevations below this value, you can set the percentage of shading. 
For example, a building will reduce radiation by 100%, a tree maybe only by 60%.\nYou can use the sun icon button in the main window to get information about the current azimuth and elevation of the sun to determine at what elevation the sun gets above buildings or trees.\n\n## License\n\nThis app is licensed under the GPLv3.\n\nThe app uses:\n- Parts from Privacy Friendly Weather (https://github.com/SecUSo/privacy-friendly-weather) which is licensed under the GPLv3\n- The weather data service is provided by [Open-Meteo](https://open-meteo.com/), under Attribution 4.0 International (CC BY 4.0)\n- Icons from [Google Material Design Icons](https://material.io/resources/icons/) licensed under Apache License Version 2.0\n- Material Components for Android (https://github.com/material-components/material-components-android) which is licensed under Apache License Version 2.0\n- Leaflet which is licensed under the very permissive 2-clause BSD License\n- WilliamChart (com.db.chart) (https://github.com/diogobernardino/williamchart) which is licensed under Apache License Version 2.0\n- Android Volley (com.android.volley) (https://github.com/google/volley) which is licensed under Apache License Version 2.0\n- AndroidX libraries (https://github.com/androidx/androidx) which is licensed under Apache License Version 2.0\n- AutoSuggestTextViewAPICall (https://github.com/Truiton/AutoSuggestTextViewAPICall) which is licensed under Apache License Version 2.0\n- Map data from OpenStreetMap, licensed under the Open Data Commons Open Database License (ODbL) by the OpenStreetMap Foundation (OSMF) (https://www.openstreetmap.org/copyright)\n- Solar positioning library (https://github.com/klausbrunner/solarpositioning) which is licensed under MIT License\n- Zip4j (https://github.com/srikanth-lingala/zip4j) which is licensed under Apache License Version 2.0\n- CompassView (https://github.com/kix2902/CompassView) which is published under Apache License 2.0\n\n## Contributing\n\nIf you find a 
bug, please open an issue in the Github repository, assuming one does not already exist.\n - Clearly describe the issue including steps to reproduce when it is a bug. In some cases screenshots can be supportive.\n - Make sure you mention the Android version and the device you have used when you encountered the issue.\n - Make your description as precise as possible.\n\nIf you know the solution to a bug please report it in the corresponding issue and if possible modify the code and create a pull request.\n'",,"2023/04/01, 13:28:22",207,GPL-3.0,70,70,"2023/09/15, 10:47:43",0,0,13,13,40,0,0,0.0,"2023/09/15, 05:41:34",V2.1,0,1,false,,false,false,,,,,,,,,,,
Solar Stations,A catalog of high-quality solar radiation monitoring stations.,AssessingSolar,https://github.com/AssessingSolar/solarstations.git,github,"solar,photovoltaics,measurement-data,open-source,solar-energy",Photovoltaics and Solar Energy,"2023/01/06, 12:03:43",9,0,3,true,,Assessing Solar,AssessingSolar,,https://SolarStations.org,b'# Solar Stations\n[](https://solarstations.org)\n\nA catalog of high-quality solar radiation monitoring stations.\n\nAn interactive map and listing of the stations can be found at [SolarStations.org](https://SolarStations.org).\n\nThe file [solarstations.csv](solarstations.csv) contains the list of stations and their metadata.\n\nPull requests with new stations or updates are highly welcome!\n',,"2021/09/23, 12:55:08",762,BSD-3-Clause,10,132,"2023/01/06, 10:32:31",34,26,41,9,292,4,0.0,0.007692307692307665,,,0,2,false,,false,false,,,https://github.com/AssessingSolar,assessingsolar.org,,,,https://avatars.githubusercontent.com/u/65510739?v=4,,,
OTSun,A python package that uses the Monte Carlo Forward Ray Tracing for the optical analysis of Solar Thermal Collectors and Solar Cells.,bielcardona,https://github.com/bielcardona/OTSun.git,github,,Photovoltaics and Solar Energy,"2023/10/08, 06:51:14",7,1,1,true,Python,,,Python,,"b'[](https://badge.fury.io/py/OTSun)\n\n\n\n# OTSun\n\nOTSun is a python package that uses the Monte Carlo Forward Ray Tracing for the optical analysis of Solar Thermal Collectors and Solar Cells. \n\n## Installation\n\nThe package can be installed either from pypi:\n`pip install otsun` or downloaded and installed with `python setup.py install`.\n\nIn order to use the package, the libraries of FreeCAD (https://www.freecadweb.org/) must be available and included in your python path.\n\n## Documentation\n\nThe documentation of the module is available at http://otsun.readthedocs.io/\n\n## How to cite OTSun?\n\nIf you need to cite OTSun, please use the following reference:\n\n* Cardona G, Pujol-Nadal R (2020) OTSun, a python package for the optical analysis of solar-thermal collectors and photovoltaic cells with arbitrary geometry. PLoS ONE 15(10): e0240735. https://doi.org/10.1371/journal.pone.0240735\n'",",https://doi.org/10.1371/journal.pone.0240735\n","2017/01/13, 08:04:48",2476,MIT,23,467,"2023/09/06, 17:43:53",0,51,54,4,49,0,0.0,0.2817955112219451,,,0,3,false,,false,false,otsun-uib/OTSunWebApp,,,,,,,,,,
pvOps,Contains a series of functions to facilitate fusion of text-based data with time series production data collected at photovoltaic sites.,sandialabs,https://github.com/sandialabs/pvOps.git,github,,Photovoltaics and Solar Energy,"2023/10/24, 22:07:04",11,0,5,true,Jupyter Notebook,Sandia National Laboratories,sandialabs,"Jupyter Notebook,Python",https://pvops.readthedocs.io/en/latest/,"b'
\n\n[](https://badge.fury.io/gh/sandialabs%2FpvOps)\n[](https://github.com/sandialabs/pvOps/blob/master/LICENSE)\n[](https://github.com/sandialabs/pvOps/actions)\n[](https://zenodo.org/badge/latestdoi/289032705)\n[](https://joss.theoj.org/papers/6c3554c98b1771125613cff94241847c)\n\npvops contains a series of functions to facilitate fusion of text-based data with time series production data collected at photovoltaic sites. The package also contains example datasets and tutorials to help demonstrate how the functions can be used.\n\nInstallation\n=============\npvops can be installed using `pip`. See more information at [readthedocs](https://pvops.readthedocs.io/en/latest/).\n\nTutorials\n=========\nTo get started with pvops we recommend working with the [tutorials](https://pvops.readthedocs.io/en/latest/pages/tutorials.html).\n\n\nPackage Layout and Documentation\n==============\n\nThe package is delineated into the following directories.\n```\n\xe2\x94\x9c\xe2\x94\x80\xe2\x94\x80\xe2\x94\x80docs : Documentation directory\n|\n\xe2\x94\x9c\xe2\x94\x80\xe2\x94\x80\xe2\x94\x80tutorials : Contains tutorials of functionality\n\xe2\x94\x82 \xe2\x94\x94\xe2\x94\x80\xe2\x94\x80\xe2\x94\x80 example_data : \xe2\x94\x94\xe2\x94\x80\xe2\x94\x80\xe2\x94\x80 Example data\n|\n\xe2\x94\x94\xe2\x94\x80\xe2\x94\x80\xe2\x94\x80pvops : Source function library\n \xe2\x94\x9c\xe2\x94\x80\xe2\x94\x80\xe2\x94\x80tests : \xe2\x94\x9c\xe2\x94\x80\xe2\x94\x80\xe2\x94\x80 Library stability tests\n \xe2\x94\x9c\xe2\x94\x80\xe2\x94\x80\xe2\x94\x80text : \xe2\x94\x9c\xe2\x94\x80\xe2\x94\x80\xe2\x94\x80 Text processing functions\n \xe2\x94\x9c\xe2\x94\x80\xe2\x94\x80\xe2\x94\x80text2time : \xe2\x94\x9c\xe2\x94\x80\xe2\x94\x80\xe2\x94\x80 Text2Timeseries functions\n \xe2\x94\x9c\xe2\x94\x80\xe2\x94\x80\xe2\x94\x80timeseries : \xe2\x94\x9c\xe2\x94\x80\xe2\x94\x80\xe2\x94\x80 Timeseries functions\n \xe2\x94\x94\xe2\x94\x80\xe2\x94\x80\xe2\x94\x80iv : \xe2\x94\x94\xe2\x94\x80\xe2\x94\x80\xe2\x94\x80 
Current-voltage functions\n```\n\nMore information about these modules is available at [readthedocs](https://pvops.readthedocs.io/en/latest/).\n\nCiting\n======\n\nIf using this package, please cite [our paper](https://ieeexplore.ieee.org/document/9518439) using the following:\n\n**Citation:** \n\n```\nH. Mendoza, M. Hopwood and T. Gunda, ""pvOps: Improving Operational Assessments through Data Fusion,"" 2021 IEEE 48th Photovoltaic Specialists Conference (PVSC), 2021, pp. 0112-0119, doi: 10.1109/PVSC43889.2021.9518439.\n```\n\n**BibTex:**\n\n```\n@INPROCEEDINGS{9518439,\n author={Mendoza, Hector and Hopwood, Michael and Gunda, Thushara},\n booktitle={2021 IEEE 48th Photovoltaic Specialists Conference (PVSC)}, \n title={pvOps: Improving Operational Assessments through Data Fusion}, \n year={2021},\n volume={},\n number={},\n pages={0112-0119},\n doi={10.1109/PVSC43889.2021.9518439}}\n```\n\nContributing\n============\n\nThe long-term success of pvops requires community support. Please see the [Contributing page](https://pvops.readthedocs.io/en/latest/) for more on how you can contribute.\n\n[](https://badges.pufler.dev)\n\nLogo Credit: [Daniel Rubinstein](http://www.danielrubinstein.com/)\n\nCopyright and License\n=======\n\npvops is copyright through Sandia National Laboratories. The software is distributed under the Revised BSD License. See [copyright and license](https://github.com/sandialabs/pvops/blob/master/LICENSE) for more information.\n'",",https://zenodo.org/badge/latestdoi/289032705","2020/08/20, 14:48:48",1161,CUSTOM,174,453,"2023/10/24, 22:07:05",6,42,79,49,0,1,0.0,0.5308219178082192,"2023/08/09, 14:09:35",0.2.0,0,8,false,,true,false,,,https://github.com/sandialabs,https://software.sandia.gov,United States,,,https://avatars.githubusercontent.com/u/4993680?v=4,,,
IEA-15-240-RWT,A 15 MW reference wind turbine repository developed in conjunction with IEA Wind.,IEAWindTask37,https://github.com/IEAWindTask37/IEA-15-240-RWT.git,github,,Wind Energy,"2023/10/22, 02:50:30",178,0,40,true,Roff,,IEAWindTask37,"Roff,Python,Scheme,Smalltalk,F*,Shell",,"b""[](https://zenodo.org/badge/latestdoi/213679527)\n\n# IEA-15-240-RWT v1.1\nThis repository contains the model data for the 15-MW offshore reference turbine developed within IEA Wind Task 37.\n\nThe documentation for the turbine is accessible here: https://www.nrel.gov/docs/fy20osti/75698.pdf\nand the semisubmersible floating support structure is documented here: https://www.nrel.gov/docs/fy20osti/76773.pdf\n\nData in this repository includes:\n* Documentation, including tabular data used in the figures from the technical report\n* OpenFAST aeroelastic model inputs\n* HAWC2 aeroelastic model inputs\n* WISDEM optimization files\n* Wind turbine ontology .yaml files\n* CAD modeling of turbine where available\n\n## Requirements\n\n*OpenFAST*:\n* Please check the release notes for OpenFAST version compatibility. OpenFAST can be compiled [from source here](https://github.com/OpenFAST/openfast.git) or precompiled Windows binaries are [available for download](https://github.com/OpenFAST/openfast/releases/latest/download/windows_openfast_binaries.zip). More information on installing and running OpenFAST is available in the [OpenFAST documentation](https://openfast.readthedocs.io/en/master/).\n* NREL's Reference OpenSource Controller (ROSCO) is required. This can be compiled [from source here](https://github.com/nrel/rosco) or precompiled binaries for all platforms are [available for download](https://github.com/NREL/ROSCO/releases/). 
The version of the ROSCO controller can be found in the header of the [DISCON.IN](https://github.com/IEAWindTask37/IEA-15-240-RWT/blob/3a00f7f38a6373f6b026aef5878b671ca7af3605/OpenFAST/IEA-15-240-RWT-UMaineSemi/IEA-15-240-RWT-UMaineSemi_DISCON.IN#L2)\n\n*HAWC2*:\n* HAWC2 can be acquired from its [homepage](https://www.hawc2.dk/). The DTU Basic Controller can be obtained from its [repository](https://gitlab.windenergy.dtu.dk/OpenLAC/BasicDTUController).\n\n*WISDEM*:\n * WISDEM can be installed from its GitHub [repository](https://github.com/WISDEM/WISDEM).\n * See the [documentation](https://wisdem.readthedocs.io) for installation and usage guides.\n\n\n## Design Updates\n\nThe IEA Wind Task 37 authors endeavor to keep the model input decks current with the latest releases and API changes. Errors and other issues pointed out by the community are also addressed to the extent that available resources make that possible. See the [Release Notes](blob/master/ReleaseNotes.md) for a detailed description of changes.\n\nWe also encourage the broader wind community to submit design updates by forking the repository and letting us know of your design customization. 
Community contributions that we are aware of include:\n* [Bladed model](https://github.com/IEAWindTask37/IEA-15-240-RWT/wiki/Frequently-Asked-Questions-(FAQ)#is-bladed-supported) implemented by [DNV](mailto:renewables.support@dnv.com)\n* [OrcaFlex model](https://github.com/IEAWindTask37/IEA-15-240-RWT/wiki/Frequently-Asked-Questions-(FAQ)#is-orcaflex-supported) implemented by Orcina, contact [Orcina](mailto:orcina@orcina.com)\n* [Detailed rotor redesign](https://data.bris.ac.uk/data/dataset/3jrb4mejp9vfd2qb3s7dreymr1) from University of Bristol described in a TORQUE 2022 [paper](https://iopscience.iop.org/article/10.1088/1742-6596/2265/3/032029/pdf), contact [Peter Greaves](mailto:peter.greaves@ore.catapult.org.uk)\n* [NuMAD model](https://github.com/UTDGriffithLab/UTD-IEA15MWBlade) developed at The University of Texas at Dallas (UTD) by [Alejandra S. Escalera Mendoza](mailto:ase180001@utdallas.edu) and [Prof D. Todd Griffith](mailto:tgriffith@utdallas.edu)\n* [Jacket support structure](https://github.com/mmrocze2/IEA-15-240-RWT) The DEME Group created a 3-legged jacket for a 50m water depth, contact [Maciej Mroczek](mailto:Mroczek.Maciej@deme-group.com)\n\n## Citations\n\nFor a list of academic papers that use or cite this turbine, please see [here (fixed-bottom)](https://scholar.google.com/scholar?cites=11739673662820715884&as_sdt=4005&sciodt=0,6&hl=en) and [here (floating)](https://scholar.google.com/scholar?cites=17665986740213390479&as_sdt=4005&sciodt=0,6&hl=en).\n\nIf you use this model in your research or publications, please cite the appropriate report as:\n\n @techreport{IEA15MW_ORWT,\n author = {Evan Gaertner and Jennifer Rinker and Latha Sethuraman and Frederik Zahle and Benjamin Anderson and Garrett Barter and Nikhar Abbas and Fanzhong Meng and Pietro Bortolotti and Witold Skrzypinski and George Scott and Roland Feil and Henrik Bredmose and Katherine Dykes and Matt Sheilds and Christopher Allen and Anthony Viselli},\n Howpublished = {NREL/TP-75698},\n 
institution = {International Energy Agency},\n title = {Definition of the {IEA} 15-Megawatt Offshore Reference Wind Turbine},\n URL = {https://www.nrel.gov/docs/fy20osti/75698.pdf},\n Year = {2020}\n }\n\n @techreport{IEA15MW_ORWT_Floating,\n author = {Christopher Allen and Anthony Viselli and Habib Dagher and Andrew Goupee and Evan Gaertner and Nikhar Abbas and Matthew Hall and Garrett Barter},\n Howpublished = {NREL/TP-76773},\n institution = {International Energy Agency},\n title = {Definition of the {UMaine} {VolturnUS-S} Reference Platform Developed for the {IEA Wind} 15-Megawatt Offshore Reference Wind Turbine}},\n URL = {https://www.nrel.gov/docs/fy20osti/76773.pdf},\n Year = {2020}\n }\n\n## Questions\n\nBefore reaching out to NREL or DTU authors with questions on the model or reports, please see our frequently asked questions (FAQ) on our [Github Wiki](https://github.com/IEAWindTask37/IEA-15-240-RWT/wiki/Frequently-Asked-Questions-(FAQ)) and current or prior [Issues](https://github.com/IEAWindTask37/IEA-15-240-RWT/issues).\n\nIf neither the FAQ or Issues address your need, please create a new Issue on this repository so that the dialogue is archived for others that might have similar questions. You can also reach out to the authors directly if that is your preference. The technical report lists the contributions of individual authors if you have a specific question. Otherwise, you can contact Garrett Barter (garrett.barter@nrel.gov).\n""",",https://zenodo.org/badge/latestdoi/213679527","2019/10/08, 15:18:15",1478,Apache-2.0,129,762,"2023/10/22, 02:50:31",24,73,145,55,3,0,0.6,0.5467479674796748,"2023/10/22, 16:32:03",v1.1.8,0,13,false,,false,false,,,https://github.com/IEAWindTask37,,,,,https://avatars.githubusercontent.com/u/36546446?v=4,,,
windpowerlib,A library to model the output of wind turbines and farms.,wind-python,https://github.com/wind-python/windpowerlib.git,github,"wind,energy,power,model,modelling",Wind Energy,"2023/04/12, 19:01:00",271,25,44,true,Python,,wind-python,"Python,Jupyter Notebook",https://oemof.org/,"b'.. image:: https://travis-ci.org/wind-python/windpowerlib.svg?branch=dev\n :target: https://travis-ci.org/wind-python/windpowerlib\n.. image:: https://coveralls.io/repos/github/wind-python/windpowerlib/badge.svg?branch=dev\n :target: https://coveralls.io/github/wind-python/windpowerlib?branch=dev\n.. image:: https://zenodo.org/badge/DOI/10.5281/zenodo.824267.svg\n :target: https://doi.org/10.5281/zenodo.824267\n.. image:: https://mybinder.org/badge_logo.svg\n :target: https://mybinder.org/v2/gh/wind-python/windpowerlib/dev?filepath=example\n.. image:: https://img.shields.io/badge/code%20style-black-000000.svg\n :target: https://github.com/psf/black\n\n.. image:: https://img.shields.io/lgtm/grade/python/g/wind-python/windpowerlib.svg?logo=lgtm&logoWidth=18\n :target: https://lgtm.com/projects/g/wind-python/windpowerlib/context:python\n \nIntroduction\n=============\n\nThe windpowerlib is a library that provides a set of functions and classes to calculate the power output of wind turbines. It was originally part of the \n`feedinlib `_ (windpower and photovoltaic) but was taken out to build up a community concentrating on wind power models.\n\nFor a quick start see the `Examples and basic usage `_ section.\n\n\nDocumentation\n==============\n\nFull documentation can be found at `readthedocs `_.\n\nUse the `project site `_ of readthedocs to choose the version of the documentation. 
\nGo to the `download page `_ to download different versions and formats (pdf, html, epub) of the documentation.\n\n\nInstallation\n============\n\nIf you have a working Python 3 (>= 3.6) environment, use pypi to install the latest windpowerlib version:\n\n::\n\n pip install windpowerlib\n\nThe windpowerlib is designed for Python 3 and tested on Python >= 3.5. We highly recommend to use virtual environments.\nPlease see the `installation page `_ of the oemof documentation for complete instructions on how to install python and a virtual environment on your operating system.\n\nOptional Packages\n~~~~~~~~~~~~~~~~~\n\nTo see the plots of the windpowerlib example in the `Examples and basic usage `_ section you should `install the matplotlib package `_.\nMatplotlib can be installed using pip:\n\n::\n\n pip install matplotlib\n\n.. _examplereference-label:\n\nExamples and basic usage\n=========================\n\nThe simplest way to run the example notebooks without installing windpowerlib is to click `here `_ and open them with Binder.\n\nThe basic usage of the windpowerlib is shown in the `ModelChain example `_ that is available as jupyter notebook and python script:\n\n* `ModelChain example (Python script) `_\n* `ModelChain example (Jupyter notebook) `_\n\nTo run the example you need example weather that is downloaded automatically and can also be downloaded here:\n\n* `Example weather data file `_\n\nTo run the examples locally you have to install the windpowerlib. To run the notebook you also need to install `notebook` using pip3. To launch jupyter notebook type ``jupyter notebook`` in the terminal.\nThis will open a browser window. Navigate to the directory containing the notebook to open it. See the jupyter notebook quick start guide for more information on `how to install `_ and\n`how to run `_ jupyter notebooks. 
In order to reproduce the figures in a notebook you need to install `matplotlib`.\n\nFurther functionalities, like the modelling of wind farms and wind turbine clusters, are shown in the `TurbineClusterModelChain example `_. As the ModelChain example it is available as jupyter notebook and as python script. The weather used in this example is the same as in the ModelChain example.\n\n* `TurbineClusterModelChain example (Python script) `_\n* `TurbineClusterModelChain example (Jupyter notebook) `_\n\nYou can also look at the examples in the `Examples section `_.\n\nWind turbine data\n==================\n\nThe windpowerlib provides data of many wind turbines but it is also possible to\nuse your own turbine data.\n\nUse internal data\n~~~~~~~~~~~~~~~~~\n\nThe windpowerlib provides `wind turbine data `_\n(power curves, hub heights, etc.) for a large set of wind turbines. See `Initialize wind turbine` in `Examples section `_ on how\nto use this data in your simulations.\n\nThe dataset is hosted and maintained on the `OpenEnergy database `_ (oedb).\nTo update your local files with the latest version of the `oedb turbine library `_ you can execute the following in your python console:\n\n.. code:: python\n\n from windpowerlib.data import store_turbine_data_from_oedb\n store_turbine_data_from_oedb()\n\nIf you find your turbine in the database it is very easy to use it in the\nwindpowerlib\n\n.. code:: python\n\n from windpowerlib import WindTurbine\n enercon_e126 = {\n ""turbine_type"": ""E-126/4200"", # turbine type as in register\n ""hub_height"": 135, # in m\n }\n e126 = WindTurbine(**enercon_e126)\n\nWe would like to encourage anyone to contribute to the turbine library by adding turbine data or reporting errors in the data.\nSee `the OEP `_ for more information on how to contribute.\n\nUse your own turbine data\n~~~~~~~~~~~~~~~~~~~~~~~~~\n\nIt is possible to use your own power curve. 
However, the most sustainable way\nis to send us the data to be included in the windpowerlib and to be available\nfor all users. This may not be possible in all cases.\n\nAssuming the data file looks like this:\n\n.. code::\n\n wind,power\n 0.0,0.0\n 3.0,39000.0\n 5.0,270000.0\n 10.0,2250000.0\n 15.0,4500000.0\n 25.0,4500000.0\n\nYou can use pandas to read the file and pass it to the turbine dictionary. If\nyou have basic knowledge of pandas, it is easy to use any kind of data file.\n\n.. code:: python\n\n import pandas as pd\n from windpowerlib import WindTurbine, create_power_curve\n my_data = pd.read_csv(""path/to/my/data/file.csv"")\n\n my_turbine_data = {\n ""nominal_power"": 6e6, # in W\n ""hub_height"": 115, # in m\n ""power_curve"": create_power_curve(\n wind_speed=my_data[""wind""], power=my_data[""power""]\n ),\n }\n\n my_turbine = WindTurbine(**my_turbine_data)\n\nSee the `modelchain_example` for more information.\n\nContributing\n==============\n\nWe are warmly welcoming all who want to contribute to the windpowerlib. If you are interested in wind models and want to help improve the existing model, do not hesitate to contact us via GitHub or email (windpowerlib@rl-institut.de).\n\nClone: https://github.com/wind-python/windpowerlib and install the cloned repository using pip:\n\n.. 
code:: bash\n\n pip install -e /path/to/the/repository\n\nAs the windpowerlib started with contributors from the `oemof developer group `_ we use the same\n`developer rules `_.\n\n**How to create a pull request:**\n\n* `Fork `_ the windpowerlib repository to your own github account.\n* Change, add or remove code.\n* Commit your changes.\n* Create a `pull request `_ and describe what you will do and why.\n* Wait for approval.\n\n**Generally the following steps are required when changing, adding or removing code:**\n\n* Add new tests if you have written new functions/classes.\n* Add/change the documentation (new feature, API changes ...).\n* Add a whatsnew entry and your name to Contributors.\n* Check if all tests still work by simply executing pytest in your windpowerlib directory:\n\n.. role:: bash(code)\n :language: bash\n\n.. code:: bash\n\n pytest\n\nCiting the windpowerlib\n========================\n\nWe use the zenodo project to get a DOI for each version. `Search zenodo for the right citation of your windpowerlib version `_.\n\nLicense\n============\n\nCopyright (c) 2019 oemof developer group\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the ""Software""), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED ""AS IS"", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n'",",https://doi.org/10.5281/zenodo.824267\n,https://zenodo.org/search?page=1&size=20&q=windpowerlib","2016/08/26, 13:50:35",2616,MIT,2,1527,"2023/10/04, 14:52:13",23,52,108,2,21,9,2.2,0.37207357859531776,"2021/03/09, 16:25:46",v0.2.1,0,9,false,,false,false,"fgbg03/PROCSIM-Running-Results-Frontend,T1mTop1a/team_project_wonders_of_wind,YuTian8328/WindAtlas,pdb-94/miguel,in-RET/inretensys-fastapi,TAMUparametric/energiapy,YuTian8328/FMIWindMap,UU-ER/EHUB-Py_Training,patsec/ot-sim,NOWUM/dmas,oemof/oemof,remmyd/HRESopt,moritz-reuter/ESEM-EE,dpinney/wiires,Runamook/iec_62056-21,Adrianonsare/Geospatial-PowerAnalysis,Adrianonsare/EnergyAnalytics,rohitsanam/streamlit-basic-app,montefesp/EPIPPy,thesethtruth/LESO,brizett/reegis_hp,Pyosch/vpplib,open-fred/lib_validation,reegis/reegis,oemof/feedinlib",,https://github.com/wind-python,,,,,https://avatars.githubusercontent.com/u/21263042?v=4,,,
turbinesFoam,A library for simulating wind and marine hydrokinetic turbines in OpenFOAM using the actuator line method.,turbinesFoam,https://github.com/turbinesFoam/turbinesFoam.git,github,"openfoam,turbines,airfoils,blade-element,actuator-line,wind-energy",Wind Energy,"2022/05/16, 14:48:37",76,0,13,true,C++,turbinesFoam,turbinesFoam,"C++,C,Python,Roff,Shell",,"b""turbinesFoam\n============\n\n[](https://app.travis-ci.com/turbinesFoam/turbinesFoam)\n\n\n\n[](https://zenodo.org/badge/latestdoi/4234/turbinesFoam/turbinesFoam)\n\nturbinesFoam is a library for simulating wind and marine hydrokinetic turbines\nin OpenFOAM using the actuator line method.\n\n[](https://www.youtube.com/watch?v=THZvV4R1vow)\n\nBe sure to check out the\n[development snapshot videos on YouTube](https://www.youtube.com/playlist?list=PLOlLyh5gytG8n8D3V1lDeZ3e9fJf9ux-e).\n\n\nContributing\n------------\n\nPull requests are very welcome!\nSee the [issue tracker](https://github.com/petebachant/turbinesFoam/issues)\nfor more details.\n\n\nFeatures\n--------\n\n`fvOptions` classes for adding actuator lines and turbines constructed from\nactuator lines to any compatible solver or turbulence model, e.g.,\n`simpleFoam`, `pimpleFoam`, `interFoam`, etc.\n\n\nInstallation\n------------\n\n```bash\ncd $WM_PROJECT_USER_DIR\ngit clone https://github.com/turbinesFoam/turbinesFoam.git\ncd turbinesFoam\n./Allwmake\n```\n\n\nUsage\n-----\n\nThere are tutorials located in `turbinesFoam/tutorials`.\n\n\nPublications\n------------\n\nBachant, P., Goude, A., and Wosnik, M. (2016) [_Actuator line modeling of vertical-axis turbines_](https://arxiv.org/abs/1605.01449). 
arXiv preprint 1605.01449.\n\n\nHow to cite\n-----------\n\nThe latest release of turbinesFoam can be cited via DOI thanks to Zenodo: [](https://zenodo.org/badge/latestdoi/4234/turbinesFoam/turbinesFoam)\n\n\nAcknowledgements\n----------------\n\nThis work was funded through a National Science Foundation CAREER award,\nprincipal investigator Martin Wosnik ([NSF CBET\n1150797](http://www.nsf.gov/awardsearch/showAward?AWD_ID=1150797), Energy for\nSustainability, original program manager Geoffrey A. Prentice, current program\nmanager Gregory L. Rorrer).\n\nOpenFOAM is free, open source software for computational fluid dynamics (CFD),\ndeveloped primarily by [CFD Direct](http://cfd.direct), on behalf of the\n[OpenFOAM](http://openfoam.org) Foundation.\n\nInterpolation, Gaussian projection, and vector rotation functions adapted from\nNREL's [SOWFA](https://github.com/NREL/SOWFA).\n""",",https://zenodo.org/badge/latestdoi/4234/turbinesFoam/turbinesFoam,https://arxiv.org/abs/1605.01449,https://zenodo.org/badge/latestdoi/4234/turbinesFoam/turbinesFoam","2014/06/24, 19:30:49",3410,CUSTOM,0,1189,"2023/06/16, 15:59:05",138,59,220,1,131,7,1.2,0.02930402930402931,"2019/11/15, 01:53:16",v0.1.1,0,5,false,,false,false,,,https://github.com/turbinesFoam,,,,,https://avatars.githubusercontent.com/u/10404114?v=4,,,
nalu-wind,Solver for wind farm simulations targeting exascale computational platforms.,Exawind,https://github.com/Exawind/nalu-wind.git,github,"low-mach,wind-energy,les,cfd,ecp,exascale-computing,exawind",Wind Energy,"2023/10/24, 22:47:17",102,0,24,true,C,exawind,Exawind,"C,C++,CMake,Fortran,Python,Shell",https://nalu-wind.readthedocs.io,"b'# Nalu-Wind \n\n[Website](https://www.exawind.org/) | [Documentation](https://nalu-wind.readthedocs.io) | [Nightly test dashboard](http://my.cdash.org/index.php?project=Exawind) \n\nNalu-Wind is a generalized, unstructured, massively parallel, incompressible\nflow solver for wind turbine and wind farm simulations. The codebase is a\nwind-focused fork of [NaluCFD](https://github.com/NaluCFD/Nalu); NaluCFD is developed \nand maintained by Sandia National Laboratories. Nalu-Wind is being actively\ndeveloped and maintained by a dedicated, multi-institutional team from [National\nRenewable Energy Laboratory](https://nrel.gov), [Sandia National\nLaboratories](https://sandia.gov), and [Univ. of Texas Austin](https://utexas.edu).\n\nNalu-Wind is developed as an open-source code with the following objectives: \n\n- an open, well-documented implementation of the state-of-the-art computational\n models for modeling wind farm flow physics at various fidelities that are\n backed by a comprehensive verification and validation (V&V) process;\n\n- be capable of performing the highest-fidelity simulations of flowfields within\n wind farms; and \n\n- be able to leverage the high-performance leadership class computing\n facilities available at DOE national laboratories.\n\nWe hope that this community-developed model will be used by research\nlaboratories, academia, and industry to develop the next-generation of wind farm\ntechnologies. We welcome the wind energy community to use Nalu-Wind in their\nresearch. When disseminating technical work that includes Nalu-Wind simulations\nplease reference the following citation:\n\n Sprague, M. 
A., Ananthan, S., Vijayakumar, G., Robinson, M., ""ExaWind: A multifidelity \n modeling and simulation environment for wind energy"", NAWEA/WindTech 2019 Conference, \n Amherst, MA, 2019.\n\n## Documentation\n\nDocumentation is available online at https://nalu-wind.readthedocs.io/ and is\nsplit into the following sections:\n\n- [Theory manual](https://nalu-wind.readthedocs.io/en/latest/source/theory/index.html):\n This section provides a detailed overview of the supported equation sets, the\n discretization and time-integration schemes, turbulence models available, etc.\n \n- [Verification manual](https://nalu-wind.readthedocs.io/en/latest/source/verification/index.html):\n This section documents the results from verification studies of the spatial\n and temporal schemes available in Nalu-Wind.\n \n- [User manual](https://nalu-wind.readthedocs.io/en/latest/source/user/index.html):\n The user manual contains detailed instructions on building the code, along\n with the required third-party libraries (TPLs) and usage.\n \nAll documentation is maintained alongside the source code within the git\nrepository and automatically deployed to ReadTheDocs website upon new commits.\n \n## Compilation and usage\n\nNalu-Wind is primarily built upon the packages provided by the [Trilinos\nproject](https://trilinos.org), which in turn depends on several third-party\nlibraries (MPI, HDF5, NetCDF, parallel NetCDF), and YAML-CPP. In addition, it\nhas the following optional dependencies: hypre, TIOGA, and OpenFAST. Detailed\nbuild instructions are available in the [user\nmanual](https://nalu-wind.readthedocs.io/en/latest/source/user/building.html).\nWe recommend using [Spack](https://spack.io/) package manager to install\nNalu-Wind on your system.\n\n### Testing and quality assurance\n\nNalu-Wind comes with a comprehensive unit test and regression test suite that\nexercise almost all major components of the code. 
The `master` branch is\ncompiled and run through a regression test suite with different compilers\n([GCC](https://gcc.gnu.org/), [LLVM/Clang](https://clang.llvm.org/), and\n[Intel](https://software.intel.com/en-us/compilers)) on Linux and MacOS\noperating systems, against both the `master` and `develop` branches of\n[Trilinos](https://github.com/trilinos/Trilinos). Tests are performed both using\nflat MPI and hybrid MPI-GPU hardware configurations. The results of the nightly\ntesting are publicly available on [CDash\ndashboard](http://my.cdash.org/index.php?project=Nalu-Wind).\n\n### Contributing, reporting bugs, and requesting help\n\nTo report issues or bugs please [create a new\nissue](https://github.com/Exawind/nalu-wind/issues/new) on GitHub.\n\nWe welcome contributions from the community in form of bug fixes, feature\nenhancements, documentation updates, etc. All contributions are processed\nthrough pull-requests on GitHub. Please follow our [contributing\nguidelines](https://github.com/Exawind/nalu-wind/blob/master/CONTRIBUTING.md)\nwhen submitting pull-requests.\n \n## License\n\nNalu-Wind is licensed under BSD 3-clause license. Please see the\n[LICENSE](https://github.com/Exawind/nalu-wind/blob/master/LICENSE) included in\nthe source code repository for more details.\n\n## Acknowledgements \n\nNalu-Wind is currently being developed with funding from Department of Energy\'s\n(DOE) Office of Science [Exascale Computing Project\n(ECP)](https://www.exascaleproject.org/) and Energy Efficiency and Renewable\nEnergy (EERE) Wind Energy Technology Office (WETO). Please see [authors\nfile](https://github.com/Exawind/nalu-wind/blob/master/AUTHORS) for a \nlist of contributors to Nalu-Wind. \n'",,"2018/05/03, 15:39:32",2001,CUSTOM,136,2376,"2023/10/24, 22:05:53",60,919,1167,153,0,13,1.3,0.8009259259259259,"2021/03/21, 18:48:03",v1.3.0,0,47,false,,false,true,,,https://github.com/Exawind,,,,,https://avatars.githubusercontent.com/u/22328575?v=4,,,
openfast,"A multi-physics, multi-fidelity tool for simulating the coupled dynamic response of wind turbines and wind farms.",OpenFAST,https://github.com/OpenFAST/openfast.git,github,"wind-turbine,wind-energy,wind-farm,aeroelasticity,wind-power,wind",Wind Energy,"2023/10/20, 17:37:40",542,0,132,true,Fortran,OpenFAST,OpenFAST,"Fortran,C++,C,Python,CMake,MATLAB,Makefile,Batchfile,Shell,Dockerfile",http://openfast.readthedocs.io,"b'OpenFAST\n========\n\n|actions| |nbsp| |rtfd|\n\n.. |actions| image:: https://github.com/openfast/openfast/actions/workflows/automated-dev-tests.yml/badge.svg?branch=dev\n :target: https://github.com/OpenFAST/openfast/actions/workflows/automated-dev-tests.yml?query=workflow%3A%22Development+Pipeline%22\n :alt: Build Status\n.. |rtfd| image:: https://readthedocs.org/projects/openfast/badge/?version=dev\n :target: https://openfast.readthedocs.io/en/dev\n :alt: Documentation Status\n.. |nbsp| unicode:: 0xA0\n :trim:\n\nOpenFAST is a wind turbine simulation tool which builds on FAST v8. FAST.Farm\nextends the capability of OpenFAST to simulate multi-turbine wind farms. They were\ncreated with the goal of being community models developed and used by research\nlaboratories, academia, and industry. They are managed by a dedicated team at the\nNational Renewable Energy Lab. Our objective is to ensure that OpenFAST and FAST.Farm\nare sustainable software that are well tested and well documented. If you\'d like\nto contribute, see the `Developer Documentation `_\nand any open GitHub issues with the\n`Help Wanted `_\ntag.\n\n**OpenFAST is under active development**.\n\nFAST v8 - OpenFAST\n------------------\nThe transition from FAST v8 to OpenFAST represents the effort to better\nsupport an open-source developer community around FAST-based aero-hydro-servo-\nelastic engineering models of wind-turbines and wind-plants. OpenFAST is the\nnext generation of FAST analysis tools. 
More information is available in the\n`transition notes `_.\n\nFAST v8, now OpenFAST, is a physics-based engineering tool for simulating the coupled dynamic\nresponse of wind turbines. OpenFAST joins aerodynamics models, hydrodynamics models\nfor offshore structures, control and electrical system (servo) dynamics models,\nand structural (elastic) dynamics models to enable coupled nonlinear aero-\nhydro-servo-elastic simulation in the time domain. The OpenFAST tool enables the\nanalysis of a range of wind turbine configurations, including two- or\nthree-blade horizontal-axis rotor, pitch or stall regulation, rigid or\nteetering hub, upwind or downwind rotor, and lattice or tubular tower. The wind\nturbine can be modeled on land or offshore on fixed-bottom or floating\nsubstructures. OpenFAST is based on advanced engineering models derived from\nfundamental laws, but with appropriate simplifications and assumptions, and\nsupplemented where applicable with computational solutions and test data.\n\nWith OpenFAST, you can run large numbers of nonlinear time-domain simulations\nin approximately real time to enable standards-based loads analysis for predicting\nwind system ultimate and fatigue loads. You can also linearize the underlying\nnonlinear model about an operating point to understand the system response\nand enable the calculation of natural frequencies, damping, and mode shapes;\nthe design of controllers, and analysis of aero-elastic instabilities.\n\nThe aerodynamic models use wind-inflow data and solve for the rotor-wake\neffects and blade-element aerodynamic loads, including dynamic stall. The\nhydrodynamics models simulate the regular or irregular incident waves and\ncurrents and solve for the hydrostatic, radiation, diffraction, and viscous\nloads on the offshore substructure. 
The control and electrical system models\nsimulate the controller logic, sensors, and actuators of the blade-pitch,\ngenerator-torque, nacelle-yaw, and other control devices, as well as the\ngenerator and power-converter components of the electrical drive. The\nstructural-dynamics models apply the control and electrical system\nreactions, apply the aerodynamic and hydrodynamic loads, add gravitational\nloads, and simulate the elasticity of the rotor, drivetrain, and support\nstructure. Coupling between all models is achieved through a modular\ninterface and coupler (glue code).\n\nFAST.Farm extends the capabilities of OpenFAST to provide physics-based\nengineering simulation of multi-turbine land-based, fixed-bottom offshore,\nand floating offshore wind farms. With FAST.Farm, you can simulate each wind\nturbine in the farm with an OpenFAST model and capture the relevant\nphysics for prediction of wind farm power performance and structural loads,\nincluding wind farm-wide ambient wind, super controller, and wake advection,\nmeandering, and merging. FAST.Farm maintains computational efficiency\nthrough parallelization to enable loads analysis for predicting the ultimate\nand fatigue loads of each wind turbine in the farm.\n\n\nDocumentation\n-------------\nThe full documentation is available at http://openfast.readthedocs.io/.\n\nThis documentation is stored and maintained alongside the source code.\nIt is compiled into HTML with Sphinx and is tied to a particular version\nof OpenFAST. 
`Readthedocs `_ hosts the following\nversions of the documentation:\n\n* ``latest`` - The latest commit on the ``main`` branch\n* ``stable`` - Corresponds to the last tagged release\n* ``dev`` - The latest commit on the ``dev`` branch\n\nThese can be toggled with the ``v: latest`` button in the lower left corner of\nthe docs site.\n\nObtaining OpenFAST and FAST.Farm\n--------------------------------\nOpenFAST and FAST.Farm are hosted entirely on GitHub so you are in the\n`right place `_!\nThe repository is structured with two branches following the\n""git-flow"" convention:\n\n* ``main``\n* ``dev``\n\nThe ``main`` branch is stable, well tested, and represents the most up to\ndate released versions of OpenFAST and FAST.Farm. The latest commit on ``main``\ncontains a tag with version info and brief release notes. The tag history can be\nobtained with the ``git tag`` command and viewed in more detail on\n`GitHub Releases `_. For general\nuse, the ``main`` branch is highly recommended.\n\nThe ``dev`` branch is generally stable and tested, but not static. It contains\nnew features, bug fixes, and documentation updates that have not been compiled\ninto a production release. Before proceeding with new development, it is\nrecommended to explore the ``dev`` branch. 
This branch is updated regularly\nthrough pull requests, so be sure to ``git fetch`` often and check\n`outstanding pull requests `_.\n\nFor those not familiar with git and GitHub, there are many resources:\n\n* https://guides.github.com\n* https://try.github.io\n* https://help.github.com/categories/bootcamp/\n* https://desktop.github.com/\n* http://nvie.com/posts/a-successful-git-branching-model/\n\nCompilation, Usage, and Development\n-----------------------------------\nDetails for\n`compiling `_,\n`using `_, and\n`developing `_\nOpenFAST and FAST.Farm on Unix-based and Windows machines are available at\n`readthedocs `_.\n\nHelp\n----\nPlease use `GitHub Issues `_ to:\n\n* ask usage questions\n* report bugs\n* request code enhancements\n\nUsers and developers may also be interested in the NREL National Wind\nTechnology Center (NWTC) `phpBB Forum `_,\nwhich is still maintained and has a long history of FAST-related questions\nand answers.\n\nAcknowledgments\n---------------\n\nOpenFAST and FAST.Farm are maintained and developed by researchers and software\nengineers at the `National Renewable Energy Laboratory `_\n(NREL), with support from the US Department of Energy\'s Wind Energy Technology\nOffice. NREL gratefully acknowledges development contributions from the following\norganizations:\n\n* Envision Energy USA, Ltd\n* Brigham Young University\n* The University of Massachusetts\n* `Intel\xc2\xae Parallel Computing Center (IPCC) `_\n'",,"2016/08/31, 20:07:10",2610,Apache-2.0,522,9558,"2023/10/25, 13:02:44",245,622,1327,358,0,8,2.0,0.6383053221288515,"2023/10/20, 17:55:43",v3.5.1,0,55,false,,false,false,,,https://github.com/OpenFAST,https://openfast.readthedocs.io,,,,https://avatars.githubusercontent.com/u/15838605?v=4,,,
amr-wind,"A massively parallel, block-structured adaptive-mesh, incompressible flow solver for wind turbine and wind farm simulations.",Exawind,https://github.com/Exawind/amr-wind.git,github,"ecp,exascale-computing,amrex,amr,wind,wind-turbines",Wind Energy,"2023/10/20, 16:12:21",82,0,37,true,C++,exawind,Exawind,"C++,CMake,Python,C,Makefile",https://exawind.github.io/amr-wind,"b'# AMR-Wind \n\n[Website](https://www.exawind.org/) | [User manual](https://exawind.github.io/amr-wind) | [API docs](https://exawind.github.io/amr-wind/api_docs) | [Nightly test dashboard](http://my.cdash.org/index.php?project=Exawind) \n\n[](https://amrex-codes.github.io/amrex/) [](https://github.com/Exawind/amr-wind/actions)\n\n\nAMR-Wind is a massively parallel, block-structured adaptive-mesh, incompressible\nflow solver for wind turbine and wind farm simulations. The codebase is a\nwind-focused fork of [incflo](https://github.com/AMReX-Codes/incflo). The solver\nis built on top of the [AMReX library](https://amrex-codes.github.io/amrex).\nAMReX library provides the mesh data structures, mesh adaptivity, as well as the\nlinear solvers used for solving the governing equations. 
AMR-Wind is actively\ndeveloped and maintained by a dedicated multi-institutional team from [Lawrence\nBerkeley National Laboratory](https://www.lbl.gov/), [National Renewable Energy\nLaboratory](https://nrel.gov), and [Sandia National\nLaboratories](https://sandia.gov).\n\nThe primary applications for AMR-Wind are: performing large-eddy simulations\n(LES) of atmospheric boundary layer (ABL) flows, simulating wind farm\nturbine-wake interactions using actuator disk or actuator line models for\nturbines, and as a background solver when coupled with a near-body solver (e.g.,\n[Nalu-Wind](https://github.com/exawind/nalu-wind)) with overset methodology to\nperform blade-resolved simulations of multiple wind turbines within a wind farm.\nFor offshore applications, the ability to model air-sea interaction effects\nand their impact on the ABL characteristics is another focus for the code\ndevelopment effort. As with other codes in the\n[Exawind](https://github.com/exawind) ecosystem, AMR-Wind shares the following\nobjectives:\n\n- an open, well-documented implementation of the state-of-the-art computational\n models for modeling wind farm flow physics at various fidelities that are\n backed by a comprehensive verification and validation (V&V) process;\n\n- be capable of performing the highest-fidelity simulations of flowfields within\n wind farms; and \n\n- be able to leverage the high-performance leadership-class computing\n facilities available at DOE national laboratories.\n\n## Documentation\n\nDocumentation is organized into a [user manual](https://exawind.github.io/amr-wind)\nand a developer-focused [API\ndocumentation](https://exawind.github.io/amr-wind/api_docs). You can either\nbrowse the docs online by following the links, or you can generate them locally\nafter downloading the code. 
Please follow the instructions in user manual to\nbuild documentation locally.\n\n## Compilation and usage\n\nAMR-Wind is built upon the [AMReX library](https://amrex-codes.github.io/amrex).\nA snapshot of the AMReX library is distributed along with the AMR-Wind source\ncode as a `git-submodule`. In addition to the AMReX library, you will require a\nmodern C++ compiler that supports the C++17 standard. Users wishing to execute\nthe code on high-performance computing (HPC) systems will also need MPI\nlibraries installed on their system. The code can also be compiled using MPI+X, \nwhere X can be OpenMP for CPU shared memory parallelism,\nCUDA to target NVIDIA GPUs, HIP for AMD GPUs, or DPC++ for Intel GPUs.\n\n### Contributing, reporting bugs, and requesting help\n\nTo report issues or bugs please [create a new\nissue](https://github.com/Exawind/amr-wind/issues/new) on GitHub.\n\nWe welcome contributions from the community in form of bug fixes, feature\nenhancements, documentation updates, etc. All contributions are processed\nthrough pull-requests on GitHub.\n\n## License\n\nAMR-Wind is licensed under BSD 3-clause license. Please see the\n[LICENSE](https://github.com/Exawind/amr-wind/blob/development/LICENSE) included in\nthe source code repository for more details.\n\n'",,"2019/11/04, 19:10:43",1451,CUSTOM,149,2005,"2023/10/20, 16:12:26",34,788,879,235,5,9,0.8,0.7905686546463245,,,0,40,false,,false,false,,,https://github.com/Exawind,,,,,https://avatars.githubusercontent.com/u/22328575?v=4,,,
QBlade,Provides hands-on design and simulation capabilities for HAWT and VAWT rotor design and shows all the fundamental relationships of design concepts and turbine performance in an easy and intuitive way.,,,custom,,Wind Energy,,,,,,,,,,http://www.q-blade.org/#welcome,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
OpenOA,"This library provides a framework for working with large time series data from wind plants, such as SCADA.",NREL,https://github.com/NREL/OpenOA.git,github,,Wind Energy,"2023/09/29, 23:19:35",153,6,27,true,Jupyter Notebook,National Renewable Energy Laboratory,NREL,"Jupyter Notebook,Python,TeX,Batchfile,Makefile,CSS",https://openoa.readthedocs.io/,"b'
\n\n[](https://mybinder.org/v2/gh/NREL/OpenOA/develop_v3?filepath=examples) [](https://gitter.im/NREL_OpenOA/community?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge) [](https://joss.theoj.org/papers/d635ef3c3784d49f6e81e07a0b35ff6b)\n\n[](https://openoa.readthedocs.io)  [](https://codecov.io/gh/NREL/OpenOA)\n\n[](https://github.com/pre-commit/pre-commit) [](https://github.com/psf/black) [](https://pycqa.github.io/isort/)\n\n-----\n\nThis library provides a framework for working with large timeseries data from wind plants, such as SCADA.\nIts development has been motivated by the WP3 Benchmarking (PRUF) project,\nwhich aims to provide a reference implementation for plant-level performance assessment.\n\nAnalysis routines are grouped by purpose into methods,\nand these methods in turn rely on more abstract toolkits.\nIn addition to the provided analysis methods,\nanyone can write their own, which is intended to provide natural\ngrowth of tools within this framework.\n\nThe library is written around Pandas Data Frames, utilizing a flexible backend\nso that data loading, processing, and analysis could be performed using other libraries,\nsuch as Dask and Spark, in the future.\n\nIf you would like to try out the code before installation or simply explore the possibilities, please see our examples on [Binder](https://mybinder.org/v2/gh/NREL/OpenOA/develop_v3?filepath=examples).\n\nIf you use this software in your work, please cite our JOSS article with the following BibTex:\n\n```\n@article{Perr-Sauer2021,\n doi = {10.21105/joss.02171},\n url = {https://doi.org/10.21105/joss.02171},\n year = {2021},\n publisher = {The Open Journal},\n volume = {6},\n number = {58},\n pages = {2171},\n author = {Jordan Perr-Sauer and Mike Optis and Jason M. Fields and Nicola Bodini and Joseph C.Y. 
Lee and Austin Todd and Eric Simley and Robert Hammond and Caleb Phillips and Monte Lunacek and Travis Kemper and Lindy Williams and Anna Craig and Nathan Agarwal and Shawn Sheng and John Meissner},\n title = {OpenOA: An Open-Source Codebase For Operational Analysis of Wind Farms},\n journal = {Journal of Open Source Software}\n}\n```\n\n### Requirements\n\n- Python 3.8, 3.9, or 3.10 with pip.\n\nWe strongly recommend using the Anaconda Python distribution and creating a new conda environment for OpenOA. You can download Anaconda through [their website](https://www.anaconda.com/products/individual).\n\nAfter installing Anaconda, create and activate a new conda environment with the name ""openoa-env"":\n\n```bash\nconda create --name openoa-env python=3.10\nconda activate openoa-env\n```\n\n### Installation\n\nClone the repository and install the library and its dependencies using pip:\n\n```bash\ngit clone https://github.com/NREL/OpenOA.git\ncd OpenOA\npip install .\n```\n\nYou should now be able to import openoa from the Python interpreter:\n\n```bash\npython\n>>> import openoa\n>>> openoa.__version__\n```\n\n#### Common Installation Issues\n\n- On Windows, you may get an error regarding geos_c.dll. To fix this, install Shapely using:\n\n```bash\nconda install Shapely\n```\n\n- On Windows, an ImportError regarding win32api can also occur. This can be resolved by fixing the version of pywin32 as follows:\n\n```bash\npip install --upgrade pywin32==255\n```\n\n#### Example Notebooks and Data\n\nThe example data will be automatically extracted as needed by the tests. To manually extract the example data for use with the example notebooks, use the following command:\n\n```bash\nunzip examples/data/la_haute_borne.zip -d examples/data/la_haute_borne/\n```\n\nThe example notebooks are located in the `examples` directory. We suggest installing the Jupyter notebook server to run the notebooks interactively. 
The notebooks can also be viewed statically on [Read The Docs](http://openoa.readthedocs.io/en/latest/examples).\n\n```bash\njupyter lab # ""jupyter notebook"" is also ok if that\'s your preference\n```\n\n### Development\n\nPlease see the developer section of the contributing guide [here](contributing.md), or on the [documentation site](https://openoa.readthedocs.io/en/latest/getting_started/contributing.html) for complete details.\n\nDevelopment dependencies are provided through the develop extra flag in setup.py. Here, we install\nOpenOA, with development dependencies, in editable mode, and activate the pre-commit workflow (note:\nthis second step must be done before committing any changes):\n\n```bash\ncd OpenOA\npip install -e "".[develop, docs]""\npre-commit install\n```\n\nOccasionally, you will need to update the dependencies in the pre-commit workflow, which will provide an error when this needs to happen. When it does, this can normally be resolved with the below code, after which you can continue with your normal git workflow:\n\n```bash\npre-commit autoupdate\ngit add .pre-commit-config.yaml\n```\n\n#### Testing\nTests are written in the Python unittest or pytest framework and are runnable using pytest. There\nare two types of tests: unit tests (located in `test/unit`) run quickly and are run automatically for\nevery pull request to the OpenOA repository. Regression tests (located at `test/regression`) provide\na comprehensive suite of scientific tests that may take a long time to run (up to 20 minutes on our\nmachines). 
These tests should be run locally before submitting a pull request, and are run weekly on\nthe develop and main branches.\n\nTo run all unit and regression tests:\n\n```bash\npytest\n```\n\nTo run unit tests only:\n\n```bash\npytest test/unit\n```\n\nTo run all tests and generate a code coverage report:\n\n```bash\npytest --cov=openoa\n```\n\n#### Documentation\n\nDocumentation is automatically built by, and visible through\n[Read The Docs](http://openoa.readthedocs.io/).\n\nYou can build the documentation with [sphinx](http://www.sphinx-doc.org/en/stable/), but will need\nto ensure [Pandoc is installed](https://pandoc.org/installing.html) on your computer first.\n\n```bash\ncd OpenOA\npip install -e "".[docs]""\ncd sphinx\nmake html\n```\n\n### Contributors\n\n\n\n\n\n\n\n\n\n[](#contributors)\n'",",https://doi.org/10.21105/joss.02171","2016/12/22, 18:16:30",2498,BSD-3-Clause,107,751,"2023/09/29, 23:19:35",8,151,250,48,25,0,4.8,0.6322701688555348,"2023/09/29, 23:24:46",v3.0,0,15,false,,false,true,"NREL/a2e2g,Riderwiesiek/CeneoWebScraper,entralliance/py-entr,paulf81/flasc_pl,NREL/flasc,vchaparro/wind-power-forecasting",,https://github.com/NREL,http://www.nrel.gov,"Golden, CO",,,https://avatars.githubusercontent.com/u/1906800?v=4,,,
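The OpenOA description above notes that the library is written around pandas DataFrames for large wind plant time series such as SCADA. As a flavor of the kind of aggregation such analyses start from, here is a pandas-only sketch; this is not OpenOA's API, and the column name and data are invented:

```python
import pandas as pd

# Hypothetical one day of 10-minute SCADA power readings for a single turbine
idx = pd.date_range("2021-01-01", periods=144, freq="10min")
scada = pd.DataFrame({"power_kw": range(144)}, index=idx)

# Resample to hourly mean power -- a typical first step before plant-level analysis
hourly = scada["power_kw"].resample("1h").mean()
print(len(hourly))     # 24 hourly bins
print(hourly.iloc[0])  # 2.5, the mean of the six readings 0..5
```

OpenOA layers its analysis methods (e.g., plant-level performance assessment) on top of exactly this kind of indexed time series data.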
ROSCO,NREL's Reference OpenSource Controller for wind turbine applications.,NREL,https://github.com/NREL/ROSCO.git,github,,Wind Energy,"2023/04/11, 16:13:40",77,0,27,true,Python,National Renewable Energy Laboratory,NREL,"Python,Fortran,Roff,Jupyter Notebook,MATLAB,F*,Scheme,CMake,C,Shell,M",https://rosco.readthedocs.io/en/latest/,"b'# NREL\'s Reference OpenSource Controller (ROSCO) toolbox for wind turbine applications\nNREL\'s Reference OpenSource Controller (ROSCO) for wind turbine applications is a toolset designed to ease controller use and implementation for the wind turbine researcher. Some primary capabilities include:\n* A reference controller with industry-standard functionality \n* Generic tuning of NREL\'s ROSCO controller\n* Simple 1-DOF turbine simulations for quick controller capability verifications\n* Parsing of OpenFAST input and output files\n\n\n## Introduction\nThe NREL Reference OpenSource Controller (ROSCO) provides an open, modular and fully adaptable baseline wind turbine controller to the scientific community. The ROSCO toolbox leverages this architecture and implementation to provide a generic tuning process for the controller. Because of the open character and modular set-up, scientists are able to collaborate and contribute in making continuous improvements to the code for the controller and the toolbox. The ROSCO controller is implemented in FORTRAN, while the remainder of the toolset is a mostly-python code base with a number of functionalities.\n\n* [ROSCO](https://github.com/NREL/ROSCO/tree/main/ROSCO) - the fortran source code for the ROSCO controller. \n* [Examples](https://github.com/NREL/ROSCO/tree/main/Examples) - short working examples of the capabilities of the ROSCO toolbox. 
\n* [Tune_Cases](https://github.com/NREL/ROSCO/tree/main/Tune_Cases) - example generic tuning scripts for a number of open-source reference turbines.\n* [Test_Cases](https://github.com/NREL/ROSCO/tree/main/Test_Cases) - numerous NREL 5MW base cases to run for controller updates and comparisons. A ""test-suite"", if you will...\n* [Matlab_Toolbox](https://github.com/NREL/ROSCO/tree/main/Matlab_Toolbox) - MATLAB scripts to parse and plot simulation output data.\n* [ofTools](https://github.com/NREL/ROSCO/tree/main/ROSCO_toolbox/ofTools) - A number of scripts to facilitate usage of OpenFAST and manage OpenFAST input and output files. \n* [linear](https://github.com/NREL/ROSCO/tree/main/ROSCO_toolbox/linear) - Scripts to aid with the use of linear models for controller tuning and simplified simulation. \n\n\n## Documentation\nAll relevant documentation about the ROSCO toolbox and ROSCO controller can be found through [ROSCO\'s readthedocs webpage](https://rosco.readthedocs.io/en/latest/). Here, users can find information on [installing the ROSCO tools](https://rosco.readthedocs.io/en/latest/source/install.html) for control purposes. Additionally, there is information on the [standard workflow](https://rosco.readthedocs.io/en/latest/source/standard_use.html), details of the input files, use cases for the ROSCO tool-chain, and more. \n\n## Issues and Discussion\nIf you find issues with any of the code that resides in this repository, you are encouraged to open a [GitHub issue](https://github.com/NREL/ROSCO/issues). If you have general questions or comments regarding the code, please start a [discussion via GitHub](https://github.com/NREL/ROSCO/discussions). We encourage you to use these resources for all ROSCO-related questions and comments, rather than other resources such as the FAST forums. This helps us keep ROSCO-related items centralized, and provides a singular place for the community to look when they have questions that might arise. 
Please keep in mind that we will do our very best to respond in a timely manner, but may take a few days to get back to you if you catch us during a busy time. \n\n## Contributing\nIf it wasn\'t obvious from _open-source_ being in the title of the tool-set, this is an open-source code base that we would love for the community to contribute to. If you find yourself fixing any bugs, writing new routines, or even making small typo changes, please submit a [pull request](https://github.com/NREL/ROSCO/pulls). \n\n## Survey\nPlease help us better understand the ROSCO user-base and how we can improve ROSCO through this brief survey:\n[ROSCO toolchain survey](https://forms.office.com/Pages/ResponsePage.aspx?id=fp3yoM0oVE-EQniFrufAgGWnC45k8q5Kl90RBkHijqBUN0JTNzBJT1QwMjIzNDhCWDlDTUZPWDdMWC4u)\n\n## Referencing\nTo reference the ROSCO source code directly, please use the following DOI:\n[](https://zenodo.org/badge/latestdoi/220498357)\n\nIf the ROSCO Toolbox played a role in your research, please cite it. This software can be\ncited as:\n\n NREL: ROSCO. Version 2.4.1, https://github.com/NREL/ROSCO, 2021.\n\nFor LaTeX users:\n\n```\n@misc{ROSCO_toolbox_2021,\n author = {NREL},\n title = {{ROSCO. Version 2.4.1}},\n year = {2021},\n publisher = {GitHub},\n journal = {GitHub repository},\n url = {https://github.com/NREL/ROSCO}\n }\n```\nIf the ROSCO generic tuning theory and implementation played a role in your research, please cite the following paper:\n```\n@Article{wes-2021-19,\nAUTHOR = {Abbas, N. and Zalkind, D. and Pao, L. 
and Wright, A.},\nTITLE = {A Reference Open-Source Controller for Fixed and Floating Offshore Wind Turbines},\nJOURNAL = {Wind Energy Science Discussions},\nVOLUME = {2021},\nYEAR = {2021},\nPAGES = {1--33},\nURL = {https://wes.copernicus.org/preprints/wes-2021-19/},\nDOI = {10.5194/wes-2021-19}\n}\n```\n\n## Additional Contributors and Acknowledgments\nPrimary contributions to ROSCO have been provided by researchers at the National Renewable Energy Laboratory and the University of Colorado Boulder. Additionally, the ROSCO controller was built upon the foundations of the [Delft Research Controller](https://github.com/TUDelft-DataDrivenControl/DRC_Fortran). Much of the intellect behind these contributions has been inspired or derived from an extensive amount of work in the literature. The bulk of this has been cited through the primary publications about this work. \n'",",https://zenodo.org/badge/latestdoi/220498357","2019/11/08, 15:47:14",1447,Apache-2.0,3,477,"2023/10/20, 06:31:52",11,136,214,74,5,4,0.6,0.25761772853185594,"2023/09/12, 15:45:48",raaw1.4,0,7,false,,false,false,,,https://github.com/NREL,http://www.nrel.gov,"Golden, CO",,,https://avatars.githubusercontent.com/u/1906800?v=4,,,
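The "simple 1-DOF turbine simulations" that the ROSCO toolbox advertises amount to integrating the rotor's angular-momentum balance, J * d(omega)/dt = Q_aero - Q_gen. A pure-Python forward-Euler sketch of that idea follows; this is illustrative only, not ROSCO's implementation, and the function name and every constant are invented:

```python
# Illustrative 1-DOF drivetrain model: J * d(omega)/dt = Q_aero - Q_gen.
# NOT ROSCO's implementation; all names and numbers below are made up.

def simulate_rotor(J, q_aero, q_gen, omega0, dt, steps):
    """Forward-Euler integration of rotor speed omega [rad/s]."""
    omega = omega0
    history = [omega]
    for _ in range(steps):
        omega += (q_aero(omega) - q_gen(omega)) / J * dt
        history.append(omega)
    return history

if __name__ == "__main__":
    # Constant aerodynamic torque vs. speed-proportional generator torque:
    # the rotor settles where Q_aero == Q_gen, i.e. omega -> 4.0e6 / 4.0e5 = 10 rad/s.
    traj = simulate_rotor(
        J=4.0e7,                    # rotor inertia [kg m^2]
        q_aero=lambda w: 4.0e6,     # aerodynamic torque [N m]
        q_gen=lambda w: 4.0e5 * w,  # generator torque [N m]
        omega0=0.5, dt=0.1, steps=10_000,
    )
    print(round(traj[-1], 2))  # close to the 10 rad/s equilibrium
```

Controller tuning in this setting is then a matter of shaping q_gen (and blade pitch) so the closed-loop speed response has the desired natural frequency and damping, which is the kind of generic tuning the toolbox automates for the full controller.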
floris,A controls-oriented engineering wake modeling framework for evaluating the impact of wind farm controls on AEP and wind farm design.,NREL,https://github.com/NREL/floris.git,github,,Wind Energy,"2023/07/27, 19:49:18",158,0,49,true,Python,National Renewable Energy Laboratory,NREL,Python,http://nrel.github.io/floris,"b'# FLORIS Wake Modeling and Wind Farm Controls Software\n\nFLORIS is a controls-focused wind farm simulation software incorporating\nsteady-state engineering wake models into a performance-focused Python\nframework. It has been in active development at NREL since 2013 and the latest\nrelease is [FLORIS v3.4.1](https://github.com/NREL/floris/releases/latest).\nOnline documentation is available at https://nrel.github.io/floris.\n\nThe software is in active development and engagement with the development team\nis highly encouraged. If you are interested in using FLORIS to conduct studies\nof a wind farm or extending FLORIS to include your own wake model, please join\nthe conversation in [GitHub Discussions](https://github.com/NREL/floris/discussions/)!\n\n## Installation\n\n**If upgrading from v2, it is highly recommended to install FLORIS V3 into a new virtual environment**.\nInstalling into a Python environment that contains FLORIS v2 may cause conflicts.\nIf you intend to use [pyOptSparse](https://mdolab-pyoptsparse.readthedocs-hosted.com/en/latest/) with FLORIS,\nit is recommended to install that package first before installing FLORIS.\n\nFLORIS can be installed by downloading the source code or via the PyPI\npackage manager with `pip`.\n\nThe simplest method is with `pip` by using this command:\n\n```bash\npip install floris\n```\n\nDevelopers and anyone who intends to inspect the source code\ncan install FLORIS by downloading the git repository\nfrom GitHub with ``git`` and use ``pip`` to locally install it.\nIt is highly recommended to use a Python virtual environment manager\nsuch as [conda](https://docs.conda.io/en/latest/miniconda.html)\nin 
order to maintain a clean and sandboxed environment. The following\ncommands in a terminal or shell will download and install FLORIS.\n\n```bash\n # Download the source code from the `main` branch\n git clone -b main https://github.com/NREL/floris.git\n\n # If using conda, be sure to activate your environment prior to installing\n # conda activate \n\n # If using pyOptSparse, install it first\n conda install -c conda-forge pyoptsparse\n\n # Install FLORIS\n pip install -e floris\n```\n\nWith both methods, the installation can be verified by opening a Python interpreter\nand importing FLORIS:\n\n```python\n >>> import floris\n >>> help(floris)\n\n Help on package floris:\n\n NAME\n floris - # Copyright 2021 NREL\n\n PACKAGE CONTENTS\n logging_manager\n simulation (package)\n tools (package)\n turbine_library (package)\n type_dec\n utilities\n version\n\n VERSION\n 3.4\n\n FILE\n ~/floris/floris/__init__.py\n```\n\nIt is important to regularly check for new updates and releases as new\nfeatures, improvements, and bug fixes will be issued on an ongoing basis.\n\n## Quick Start\n\nFLORIS is a Python package typically run from the command line by providing\nan input file with an initial configuration. It can be installed with\n```pip install floris``` (see [installation](https://nrel.github.io/floris/installation)).\nThe typical entry point is\n[FlorisInterface](https://nrel.github.io/floris/_autosummary/floris.tools.floris_interface.FlorisInterface.html#floris.tools.floris_interface.FlorisInterface)\nwhich accepts the path to the input file as an argument. 
From there,\nchanges can be made to the initial configuration through the\n[FlorisInterface.reinitialize](https://nrel.github.io/floris/_autosummary/floris.tools.floris_interface.FlorisInterface.html#floris.tools.floris_interface.FlorisInterface.reinitialize)\nroutine, and the simulation is executed with\n[FlorisInterface.calculate_wake](https://nrel.github.io/floris/_autosummary/floris.tools.floris_interface.FlorisInterface.html#floris.tools.floris_interface.FlorisInterface.calculate_wake).\n\n```python\nfrom floris.tools import FlorisInterface\nfi = FlorisInterface(""path/to/input.yaml"")\nfi.reinitialize(wind_directions=[i for i in range(10)])\nfi.calculate_wake()\n```\n\nFinally, results can be analyzed via post-processing functions available within\n[FlorisInterface](https://nrel.github.io/floris/_autosummary/floris.tools.floris_interface.FlorisInterface.html#floris.tools.floris_interface.FlorisInterface)\nsuch as\n- [FlorisInterface.get_turbine_layout](https://nrel.github.io/floris/_autosummary/floris.tools.floris_interface.FlorisInterface.html#floris.tools.floris_interface.FlorisInterface.get_turbine_layout)\n- [FlorisInterface.get_turbine_powers](https://nrel.github.io/floris/_autosummary/floris.tools.floris_interface.FlorisInterface.html#floris.tools.floris_interface.FlorisInterface.get_turbine_powers)\n- [FlorisInterface.get_farm_AEP](https://nrel.github.io/floris/_autosummary/floris.tools.floris_interface.FlorisInterface.html#floris.tools.floris_interface.FlorisInterface.get_farm_AEP)\n\nand in a visualization package at [floris.tools.visualization](https://nrel.github.io/floris/_autosummary/floris.tools.floris_interface.FlorisInterface.html#floris.tools.visualization).\nA collection of examples describing the creation of simulations as well as\nanalysis and post-processing is included in the\n[repository](https://github.com/NREL/floris/tree/main/examples)\nand described in detail in [Examples Index](https://nrel.github.io/floris/examples).\n\n## 
Engaging on GitHub\n\nFLORIS leverages the following GitHub features to coordinate support and development efforts:\n\n- [Discussions](https://github.com/NREL/floris/discussions): Collaborate to develop ideas for new use cases, features, and software designs, and get support for usage questions\n- [Issues](https://github.com/NREL/floris/issues): Report potential bugs and well-developed feature requests\n- [Projects](https://github.com/orgs/NREL/projects/18/): Include current and future work on a timeline and assign a person to ""own"" it\n\nGenerally, the first entry point for the community will be within one of the\ncategories in Discussions.\n[Ideas](https://github.com/NREL/floris/discussions/categories/ideas) is a great spot to develop the\ndetails for a feature request. [Q&A](https://github.com/NREL/floris/discussions/categories/q-a)\nis where to get usage support.\n[Show and tell](https://github.com/NREL/floris/discussions/categories/show-and-tell) is a free-form\nspace to show off the things you are doing with FLORIS.\n\n\n# License\n\nCopyright 2022 NREL\n\nLicensed under the Apache License, Version 2.0 (the ""License"");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an ""AS IS"" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n'",,"2019/04/01, 17:38:14",1668,Apache-2.0,97,1940,"2023/10/22, 12:41:50",71,338,470,118,3,16,1.9,0.41226472374013357,"2023/07/27, 19:54:40",v3.4.1,0,25,false,,false,true,,,https://github.com/NREL,http://www.nrel.gov,"Golden, CO",,,https://avatars.githubusercontent.com/u/1906800?v=4,,,
windtools,The Wind Energy Generation Tools provides useful tools to assist in wind energy simulations.,FZJ-IEK3-VSA,https://github.com/FZJ-IEK3-VSA/windtools.git,github,,Wind Energy,"2020/06/19, 12:12:52",5,0,1,false,Python,FZJ-IEK3,FZJ-IEK3-VSA,Python,,"b'\xef\xbb\xbf
\n\n# Wind Energy Generation Tools\n\nThe Wind Energy Generation Tools provides useful tools to assist in wind energy simulations.\n\nCurrent list of tools:\n * Synthetic Wind Turbine Power Curve Generator: \n Produces turbine power curves as a function of a turbine\'s specific capacity.\n\n\n---\n## Usage Examples\n* [Synthetic Power Curve](Examples/SyntheticPowerCurve.ipynb)\n\n---\n## Installation\n\nClone a local copy of the repository to your computer\n\n $ git clone https://github.com/FZJ-IEK3-VSA/windtools.git\n \nThen install via pip as follows \n \n $ cd /windtools\n $ pip install -e .\n \n\n---\n## Associated papers\n\n* [The future of European onshore wind energy potential: Detailed distribution and simulation of advanced turbine designs](https://linkinghub.elsevier.com/retrieve/pii/S0360544219311818)\n\n* [The Techno-Economic Potential of Offshore Wind Energy With Optimized Future Turbine Designs in Europe](https://www.preprints.org/manuscript/201902.0121/v1)\n\n---\n## Citation\n\nIf you decide to use this module anywhere in a published work, please kindly cite us using the following\n\n```bibtex\n@article{Ryberg2019,\n author = {Ryberg, David Severin and Caglayan, Dilara Gulcin and Schmitt, Sabrina and Lin{\\ss}en, Jochen and Stolten, Detlef and Robinius, Martin},\n doi = {10.1016/j.energy.2019.06.052},\n issn = {03605442},\n journal = {Energy},\n month = {sep},\n pages = {1222--1238},\n title = {{The future of European onshore wind energy potential: Detailed distribution and simulation of advanced turbine designs}},\n url = {https://linkinghub.elsevier.com/retrieve/pii/S0360544219311818},\n volume = {182},\n year = {2019}\n}\n```\n\n---\n## License\n\nMIT License\n\nCopyright (c) 2017 David Severin Ryberg (FZJ IEK-3), Heidi Heinrichs (FZJ IEK-3), Martin Robinius (FZJ IEK-3), Detlef Stolten (FZJ IEK-3)\n\nYou should have received a copy of the MIT License along with this program. \nIf not, see \n\n## About Us \n
\n\nWe are the [Process and Systems Analysis](http://www.fz-juelich.de/iek/iek-3/EN/Forschung/_Process-and-System-Analysis/_node.html) department at the [Institute of Energy and Climate Research: Electrochemical Process Engineering (IEK-3)](http://www.fz-juelich.de/iek/iek-3/EN/Home/home_node.html) belonging to the Forschungszentrum J\xc3\xbclich. Our interdisciplinary department\'s research is focusing on energy-related process and systems analyses. Data searches and system simulations are used to determine energy and mass balances, as well as to evaluate performance, emissions and costs of energy systems. The results are used for performing comparative assessment studies between the various systems. Our current priorities include the development of energy strategies, in accordance with the German Federal Government\xe2\x80\x99s greenhouse gas reduction targets, by designing new infrastructures for sustainable and secure energy supply chains and by conducting cost analysis studies for integrating new technologies into future energy market frameworks.\n\n\n## Acknowledgment\n\nThis work was supported by the Helmholtz Association under the Joint Initiative [""Energy System 2050 \xe2\x80\x93 A Contribution of the Research Field Energy""](https://www.helmholtz.de/en/research/energy/energy_system_2050/).\n\n
\n'",,"2019/08/01, 10:06:34",1546,Apache-2.0,0,12,"2020/06/19, 12:12:53",1,1,1,0,1223,1,0.0,0.0,,,0,1,false,,false,false,,,https://github.com/FZJ-IEK3-VSA,https://www.fz-juelich.de/iek/iek-3/EN/Home/home_node.html,Forschungszentrum Jülich,,,https://avatars.githubusercontent.com/u/28654423?v=4,,,
PyWake,An AEP calculator for wind farms implemented in Python including a collection of wake models.,TOPFARM,,custom,,Wind Energy,,,,,,,,,,https://gitlab.windenergy.dtu.dk/TOPFARM/PyWake,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
WISDEM,Wind Plant Integrated System Design and Engineering Model.,WISDEM,https://github.com/WISDEM/WISDEM.git,github,"openmdao,systems-engineering,wisdem,wind",Wind Energy,"2023/09/27, 18:58:51",123,2,32,true,Python,WISDEM,WISDEM,"Python,C,Fortran,Meson,Shell,Makefile",https://www.nrel.gov/wind/systems-engineering.html,"b'# WISDEM®\n\n[](https://github.com/WISDEM/WISDEM/actions)\n[](https://coveralls.io/github/WISDEM/WISDEM?branch=develop)\n[](https://wisdem.readthedocs.io/en/master/?badge=master)\n\n\nThe Wind-Plant Integrated System Design and Engineering Model (WISDEM®) is a set of models for assessing overall wind plant cost of energy (COE). The models use wind turbine and plant cost and energy production as well as financial models to estimate COE and other wind plant system attributes. WISDEM® is accessed through Python, is built using [OpenMDAO](https://openmdao.org/), and uses several sub-models that are also implemented within OpenMDAO. These sub-models can be used independently but they are required to use the overall WISDEM® turbine design capability. Please install all of the pre-requisites prior to installing WISDEM® by following the directions below. For additional information about the NWTC effort in systems engineering that supports WISDEM® development, please visit the official [NREL systems engineering for wind energy website](https://www.nrel.gov/wind/systems-engineering.html).\n\nAuthor: [NREL WISDEM Team](mailto:systems.engineering@nrel.gov)\n\n## Documentation\n\nSee local documentation in the `docs`-directory or access the online version at \n\n## Packages\n\nWISDEM® is a family of modules. 
The core modules are:\n\n* _CommonSE_ includes several libraries shared among modules\n* _FloatingSE_ works with the floating platforms\n* _DrivetrainSE_ sizes the drivetrain and generator systems (formerly DriveSE and GeneratorSE)\n* _TowerSE_ is a tool for tower (and monopile) design\n* _RotorSE_ is a tool for rotor design\n* _NREL CSM_ is the regression-based turbine mass, cost, and performance model\n* _ORBIT_ is the process-based balance of systems cost model for offshore plants\n* _LandBOSSE_ is the process-based balance of systems cost model for land-based plants\n* _Plant_FinanceSE_ runs the financial analysis of a wind plant\n\nThe core modules draw upon some utility packages, which are typically compiled code with python wrappers:\n\n* _Airfoil Preppy_ is a tool to handle airfoil polar data\n* _CCBlade_ is the BEM module of WISDEM\n* _pyFrame3DD_ brings libraries to handle various coordinate transformations\n* _MoorPy_ is a quasi-static mooring line model\n* [_pyOptSparse_](https://github.com/mdolab/pyoptsparse) provides some additional optimization algorithms to OpenMDAO\n\n\n## Installation\n\nInstallation with [Anaconda](https://www.anaconda.com) is the recommended approach because of the ability to create self-contained environments suitable for testing and analysis. WISDEM® requires [Anaconda 64-bit](https://www.anaconda.com/distribution/). However, the `conda` command has begun to show its age and we now recommend the one-for-one replacement with `mamba` via the [Miniforge distribution](https://github.com/conda-forge/miniforge/releases), which is much more lightweight and more easily solves for the WISDEM package dependencies.\n\n### Installation as a ""library""\n\nTo use WISDEM\'s modules as a library for incorporation into other scripts or tools, WISDEM is available via `mamba install wisdem` or `pip install wisdem`, assuming that you have already setup your python environment. 
Note that on Windows platforms, we suggest using `conda/mamba` exclusively.\n\n### Installation for direct use\n\nThese instructions are for interaction with WISDEM directly, the use of its examples, and the direct inspection of its source code.\n\nThe installation instructions below use the environment name, ""wisdem-env,"" but any name is acceptable. For those working behind company firewalls, you may have to change the conda authentication with `conda config --set ssl_verify no`. Proxy servers can also be set with `conda config --set proxy_servers.http http://id:pw@address:port` and `conda config --set proxy_servers.https https://id:pw@address:port`. To setup an environment based on a different Github branch of WISDEM, simply substitute the branch name for `master` in the setup line.\n\n1. Setup and activate the Anaconda environment from a prompt (Anaconda3 Power Shell on Windows or Terminal.app on Mac)\n\n mamba config --add channels conda-forge\n mamba env create --name wisdem-env -f https://raw.githubusercontent.com/WISDEM/WISDEM/master/environment.yml python=3.10\n mamba activate wisdem-env\n\n2. In order to directly use the examples in the repository and peek at the code when necessary, we recommend all users install WISDEM in *developer / editable* mode using the instructions here. If you really just want to use WISDEM as a library and lean on the documentation, you can always do `conda install wisdem` and be done. Note the differences between Windows and Mac/Linux build systems. 
For Linux, we recommend using the native compilers (for example, gcc and gfortran in the default GNU suite).\n\n mamba install -y petsc4py mpi4py # (Mac / Linux only)\n mamba install -y gfortran # (Mac only without Homebrew or Macports compilers)\n mamba install -y m2w64-toolchain libpython # (Windows only)\n git clone https://github.com/WISDEM/WISDEM.git\n cd WISDEM\n python setup.py develop\t\t\t\t # Currently more reliable than: pip install -e\n\n\n**NOTE:** To use WISDEM again after installation is complete, you will always need to activate the conda environment first with `conda activate wisdem-env`\n\n\n## Run Unit Tests\n\nEach package has its own set of unit tests. These can be run in batch with the `test_all.py` script located in the top level `test`-directory.\n\n## Feedback\n\nFor software issues please use . For functionality and theory related questions and comments please use the NWTC forum for [Systems Engineering Software Questions](https://wind.nrel.gov/forum/wind/viewtopic.php?f=34&t=1002).\n'",,"2014/09/04, 20:30:24",3337,Apache-2.0,245,3597,"2023/10/17, 21:59:53",10,326,460,66,7,0,0.5,0.5296334361082562,"2023/09/27, 18:59:21",v3.11.1,0,22,false,,false,false,"DTUWindEnergy/hydesign,NREL/ROSCO_toolbox",,https://github.com/WISDEM,https://www.nrel.gov/wind/systems-engineering.html,"NREL National Wind Technology Center, Boulder, CO",,,https://avatars.githubusercontent.com/u/5444272?v=4,,,
WOMBAT,Windfarm Operations & Maintenance cost-Benefit Analysis Tool.,WISDEM,https://github.com/WISDEM/WOMBAT.git,github,"simulation,wind-energy,python3,simpy,operations-maintenance",Wind Energy,"2023/09/20, 23:32:26",12,2,3,true,Python,WISDEM,WISDEM,"Python,TeX",https://wisdem.github.io/WOMBAT/,"b'# WOMBAT: Windfarm Operations & Maintenance cost-Benefit Analysis Tool\n\n[](https://www.osti.gov/biblio/1894867)\n[](https://badge.fury.io/py/wombat)\n[](https://opensource.org/licenses/Apache-2.0)\n[](https://mybinder.org/v2/gh/WISDEM/WOMBAT/main?filepath=examples)\n[](https://wisdem.github.io/WOMBAT)\n\n[](https://github.com/pre-commit/pre-commit)\n[](https://github.com/psf/black)\n[](https://pycqa.github.io/isort/)\n\nThis library provides a tool to simulate the operation and maintenance phase (O&M) of\ndistributed, land-based, and offshore windfarms using a discrete event simulation\nframework.\n\nWOMBAT is written around the [`SimPy`](https://gitlab.com/team-simpy/simpy) discrete\nevent simulation framework. 
Additionally, this is supported using a flexible and modular\nobject-oriented code base, which enables the modeling of arbitrarily large (or small)\nwindfarms with as many or as few failure and maintenance tasks as can be encoded.\n\nPlease note that this is still heavily under development, so you may find some functionality\nto be incomplete at the moment, but rest assured the functionality is expanding.\nWith that said, it would be greatly appreciated for issues or PRs to be submitted for\nany improvements at all, from fixing typos (guaranteed to be a few) to features to\ntesting.\n\nIf you use this library please cite our NREL Technical Report:\n\n```bibtex\n @techreport{hammond2022wombat,\n title = {Windfarm Operations and Maintenance cost-Benefit Analysis Tool (WOMBAT)},\n author = {Hammond, Rob and Cooperman, Aubryn},\n abstractNote = {This report provides technical documentation and background on the newly-developed Wind Operations and Maintenance cost-Benefit Analysis Tool (WOMBAT) software. WOMBAT is an open-source model that can be used to obtain cost estimates for operations and maintenance of land-based or offshore wind power plants. The software was designed to be flexible and modular to allow for implementation of new strategies and technological innovations for wind plant maintenance. WOMBAT uses a process-based simulation approach to model day-to-day operations, repairs, and weather conditions. High-level outputs from WOMBAT, including time-based availability and annual operating costs, are found to agree with published results from other models.},\n doi = {10.2172/1894867},\n url = {https://www.osti.gov/biblio/1894867},\n place = {United States},\n year = {2022},\n month = {10},\n institution = {National Renewable Energy Lab. 
(NREL)},\n }\n```\n\n## WOMBAT in Action\n\nThere are a few Jupyter notebooks to get users up and running with WOMBAT in the `examples/`\nfolder, but here are a few highlights:\n\n> **Note**\n> In v0.6 the results will diverge significantly under certain modeling conditions from\n> past versions due to substantial model upgrades on the backend and new/updated\n> features to better specify how repairs are managed.\n\n* Dinwoodie, et al. replication for `wombat` can be found in the\n `examples folder `_.\n* IEA Task 26\n `validation exercise `_.\n* Presentations: `slides `_.\n\n\n## Setup\n\n### Requirements\n\n* Python 3.8 through 3.10\n\n> **Note**\n> For Python 3.10 users that seek to install more than the base dependencies, it has\n> been noted that pip may take a long time to resolve all of the package requirements,\n> so it is recommended to use the following workflow:\n\n```console\n# Enter the source code directory\ncd wombat/\n\n# First install the base package requirements\npip install -e .\n\n# Then install whichever additional dependencies are required/desired\npip install -e \'.[dev]\' # \'.[docs]\' or \'.[all]\'\n```\n\n### Environment Setup\n\nDownload the latest version of [Miniconda](https://docs.conda.io/en/latest/miniconda.html)\nfor the appropriate OS. 
Follow the remaining\n[steps](https://conda.io/projects/conda/en/latest/user-guide/install/index.html#regular-installation)\nfor the appropriate OS version.\n\nUsing conda, create a new virtual environment:\n\n```console\nconda create -n python=3.8 --no-default-packages\nconda activate \nconda install -c anaconda pip\n\n# activate the environment\nconda activate \n\n# to deactivate\nconda deactivate\n```\n\n### Installation\n\n\n#### Pip\n\n```console\npip install wombat\n```\n\n#### From Source\n\nInstall it directly into an activated virtual environment:\n\n```console\ngit clone https://github.com/WISDEM/WOMBAT.git\ncd wombat\npython setup.py install\n\n# Alternatively:\npip install .\n```\n\n#### Usage\n\nAfter installation, the package can be imported:\n\n```console\npython\nimport wombat\nwombat.__version__\n```\n\nFor further usage, please see the documentation site at https://wisdem.github.io/WOMBAT.\n\n\n### Requirements for Contributing to WOMBAT\n\n#### Code Contributions\n\nCode contributors should note that there is both an additional dependency suite for\nrunning the tests and enabling the pre-commit workflow to automatically standardize the\ncore code formatting principles.\n\n```console\ngit clone https://github.com/WISDEM/WOMBAT.git\ncd wombat\n\n# Install the additional dependencies for running the tests and automatic code formatting\npip install -e \'.[dev]\'\n\n# Enable the pre-commit workflow for automatic code formatting\npre-commit install\n\n# ... contributions and commits ...\n\n# Run the tests and ensure they all pass\npytest tests\n```\n\nBasic pre-commit issues that users might encounter and their remedies:\n\n* For any failed run, changes may have been either automatically applied or require\n further edits from the contributor. In either case, after changes have been made,\n contributors will have to rerun `git add ` and\n `git commit -m ` to restart the pre-commit workflow with the\n applied changes. 
Once all checks pass, the commit is safe to be pushed.\n* `isort`, `black`, or simple file checks failed, but made changes\n * rerun the `add` and `commit` processes as needed until the changes satisfy the checks\n* `pylint` or `flake8` failed:\n * Address the errors and rerun the `add` and `commit` processes\n* `mypy` has type errors that seem incorrect\n * Double check the typing is in fact as correct as it seems it should be and rerun the\n `add` and `commit` processes\n * If `mypy` simply seems confused with seemingly correct types, the following statement\n can be added above the `mypy` error:\n `assert isinstance(, )`\n * If that\'s still not working, but you are definitely sure the types are correct,\n simply add a `# type: ignore` comment at the end of the line. Sometimes `mypy` struggles\n with complex scenarios, or especially with certain `attrs` conventions.\n\n#### Documentation Contributions\n\n```console\ngit clone https://github.com/WISDEM/WOMBAT.git\ncd wombat\npip install -e \'.[docs]\'\n```\n\nBuild the site\n\n> **Note**\n> You may want to change the ""execute_notebook"" parameter in the `conf.py` file to\n> ""off"" unless you\'re updating the coded examples or they will be run every time you\n> build the site.\n\n```console\ncd docs/\nsphinx-build -b html source _build && make html\n```\n\nView the results: `docs/_build/html/index.html`\n\n#### Code and Documentation Contributions\n\n```console\ngit clone https://github.com/WISDEM/WOMBAT.git\ncd wombat\npip install -e \'.[all]\'\n```\n'",,"2021/04/19, 18:17:42",919,Apache-2.0,74,228,"2023/10/23, 19:58:06",15,97,107,71,2,1,0.0,0.10067114093959728,"2023/08/28, 18:30:33",v0.8.1,0,3,false,,false,false,"NREL/WAVES,sevstafiev/RaifHack2021",,https://github.com/WISDEM,https://www.nrel.gov/wind/systems-engineering.html,"NREL National Wind Technology Center, Boulder, CO",,,https://avatars.githubusercontent.com/u/5444272?v=4,,,
LandBOSSE,"The Land-based Balance-of-System Systems Engineering model is a systems engineering tool that estimates the balance-of-system costs associated with installing utility scale wind plants (10, 1.5 MW turbines or larger).",WISDEM,https://github.com/WISDEM/LandBOSSE.git,github,,Wind Energy,"2023/06/19, 22:12:07",16,0,3,true,Python,WISDEM,WISDEM,Python,,"b""# LandBOSSE\n\n## Welcome to LandBOSSE!\n\nThe Land-based Balance-of-System Systems Engineering (LandBOSSE) model is a systems engineering tool that estimates the balance-of-system (BOS) costs associated with installing utility scale wind plants (10, 1.5 MW turbines or larger). It can execute on macOS and Windows. At this time, for both platforms, it is a command line tool.\n\nThe methods used to develop this model (specifically, LandBOSSE Version 2.1.0) are described in greater detail in the following report:\n\nEberle, Annika, Owen Roberts, Alicia Key, Parangat Bhaskar, and Katherine Dykes.\n2019. NREL\xe2\x80\x99s Balance-of-System Cost Model for Land-Based Wind. Golden, CO:\nNational Renewable Energy Laboratory. NREL/TP-6A20-72201.\nhttps://www.nrel.gov/docs/fy19osti/72201.pdf.\n\n## User Guides\n\nFirst, read the technical report to understand the big picture of LandBOSSE. In the technical report, you will find process diagrams, equations and the modules that implement them. Then, come back to this documentation and read the user guide.\n\nIn brief, LandBOSSE takes `.xlsx` spreadsheets, reads input data from tabs on the spreadsheets, and writes the results to an output `.xlsx` file. There are three sections in the user guide to demonstrate how to perform these steps.\n\nThe user guide comes in three parts:\n\n1. Software installation,\n\n2. Input data configuration, and\n\n3. 
Output data analysis.\n\n### Software Installation\n\nThere are two options depending on whether you are a developer or an end user and what operating system you are running.\n\n- **Windows end-user**: If you run the Microsoft Windows operating system and aren't setting up as a developer who is going to be modifying the core library, these instructions are for you. [Find out how to configure Windows for end users.](installation_instructions/windows_end_user.md)\n\n- **macOS end user** and **macOS developer**: If you run the macOS operating system, either as an end-user or as a developer, these instructions are for you. Both developers and end-users will need most of the steps. [Find out how to configure macOS for end users and developers.](installation_instructions/macos_developer.md)\n\n### Operation after the installation\n\nReview the installation instructions on how to activate a virtual environment, if you haven't already.\n\nThen, read the [Operation and Folder Structure](installation_instructions/operation_and_folder_structure.md) for details on running the command that executes LandBOSSE from the command line.\n""",,"2014/10/06, 20:54:57",3305,CUSTOM,18,618,"2023/06/19, 22:12:08",22,91,168,8,127,2,0.1,0.40529531568228105,"2023/06/19, 22:13:14",v2.5.0,0,7,false,,false,false,,,https://github.com/WISDEM,https://www.nrel.gov/wind/systems-engineering.html,"NREL National Wind Technology Center, Boulder, CO",,,https://avatars.githubusercontent.com/u/5444272?v=4,,,
OpenMDAO,Optimization of Aerodynamic systems.,OpenMDAO,https://github.com/OpenMDAO/OpenMDAO.git,github,"nasa,open-source,framework,openmdao,optimization",Wind Energy,"2023/10/23, 14:51:31",443,198,89,true,Python,OpenMDAO,OpenMDAO,"Python,Jupyter Notebook,JavaScript,HTML,CSS,TeX,Shell",http://openmdao.org,"b'[![GitHub Actions Test Badge][17]][18]\n[![Coveralls Badge][13]][14]\n[![PyPI version][10]][11]\n[![PyPI Monthly Downloads][12]][11]\n\n# [OpenMDAO][0]\n\nOpenMDAO is an open-source high-performance computing platform for\nsystems analysis and multidisciplinary optimization, written in Python.\nIt enables you to decompose your models, making them easier to build and\nmaintain, while still solving them in a tightly coupled manner with\nefficient parallel numerical methods.\n\nThe OpenMDAO project is primarily focused on supporting gradient-based\noptimization with analytic derivatives to allow you to explore large\ndesign spaces with hundreds or thousands of design variables, but the\nframework also has a number of parallel computing features that can\nwork with gradient-free optimization, mixed-integer nonlinear\nprogramming, and traditional design space exploration.\n\nIf you are using OpenMDAO, please [cite][20] us!\n\n## Documentation\n\nDocumentation for the latest version can be found [here][2].\n\nDocumentation archives for prior versions can be found [here][3].\n\n## Important Notice\n\nWhile the API is relatively stable, **OpenMDAO** remains in active development.\nThere will be periodic changes to the API.\nUsers are encouraged to pin their version of OpenMDAO to a recent release and\nupdate periodically.\n\n## Install OpenMDAO\n\nYou have two options for installing **OpenMDAO**: (1) from the\n[Python Package Index (PyPI)][1], and (2) from the [GitHub repository][4].\n\n**OpenMDAO** includes several optional sets of dependencies including:\n`test` for installing the developer tools (e.g., testing, coverage),\n`docs` for building the documentation 
and\n`visualization` for some extra visualization tools.\nSpecifying `all` will include all of the optional dependencies.\n\n### Install from [PyPI][1]\n\nThis is the easiest way to install **OpenMDAO**. To install only the runtime\ndependencies:\n\n    pip install openmdao\n\nTo install all the optional dependencies:\n\n    pip install openmdao[all]\n\n### Install from a Cloned Repository\n\nThis allows you to install **OpenMDAO** from a local copy of the source code.\n\n    git clone http://github.com/OpenMDAO/OpenMDAO\n    pip install OpenMDAO\n\nIf you would like to make changes to **OpenMDAO** it is recommended you\ninstall it in *[editable][16]* mode (i.e., development mode) by adding the `-e`\nflag when calling `pip`; this way, any changes you make to the source code will\nbe included when you import **OpenMDAO** in *Python*. You will also want to\ninstall the packages necessary for running **OpenMDAO**\'s tests and documentation\ngenerator. You can install everything needed for development by running:\n\n    pip install -e OpenMDAO[all]\n\n## OpenMDAO Versions\n\n**OpenMDAO 3.x.y** represents the current, supported version. It requires Python 3.7\nor later and is maintained [here][4]. To upgrade to the latest release, run:\n\n    pip install --upgrade openmdao\n\n**OpenMDAO 2.10.x** was the last version to support Python 2.x and is no longer supported.\nTo install this older release, run:\n\n    pip install ""openmdao<3""\n\n**OpenMDAO 1.7.4** was an earlier version of OpenMDAO and is also no longer supported.\nThe code repository is now named **OpenMDAO1**, and has moved [here][5]. To install it, run:\n\n    pip install ""openmdao<2""\n\nThe legacy **OpenMDAO v0.x** (versions 0.13.0 and older) of the\n**OpenMDAO-Framework** are [here][6].\n\n## Test OpenMDAO\n\nUsers are encouraged to run the unit tests to ensure **OpenMDAO** is performing\ncorrectly. In order to do so, you must install the testing dependencies.\n\n1. 
Install **OpenMDAO** and its testing dependencies:\n\n `pip install openmdao[test]`\n\n > Alternatively, you can clone the repository, as explained\n [here](#install-from-a-cloned-repository), and install the development\n dependencies as described [here](#install-the-developer-dependencies).\n\n2. Run tests:\n\n `testflo openmdao -n 1`\n\n3. If everything works correctly, you should see a message stating that there\nwere zero failures. If the tests produce failures, you are encouraged to report\nthem as an [issue][7]. If so, please make sure you include your system spec,\nand include the error message.\n\n > If tests fail, please include your system information, you can obtain\n that by running the following commands in *python* and copying the results\n produced by the last line.\n\n import platform, sys\n\n info = platform.uname()\n (info.system, info.version), (info.machine, info.processor), sys.version\n\n > Which should produce a result similar to:\n\n ((\'Windows\', \'10.0.17134\'),\n (\'AMD64\', \'Intel64 Family 6 Model 94 Stepping 3, GenuineIntel\'),\n \'3.6.6 | packaged by conda-forge | (default, Jul 26 2018, 11:48:23) ...\')\n\n## Build the Documentation for OpenMDAO\n\nDocumentation for the latest version can always be found [here][2], but if you would like to build a local copy you can find instructions to do so [here][19].\n\n[0]: http://openmdao.org/ ""OpenMDAO""\n[1]: https://pypi.org/project/openmdao/ ""OpenMDAO @PyPI""\n\n[2]: http://openmdao.org/newdocs/versions/latest ""Latest Docs""\n[3]: http://openmdao.org/docs ""Archived Docs""\n\n[4]: https://github.com/OpenMDAO/OpenMDAO ""OpenMDAO Git Repo""\n[5]: https://github.com/OpenMDAO/OpenMDAO1 ""OpenMDAO 1.x Git Repo""\n[6]: https://github.com/OpenMDAO/OpenMDAO-Framework ""OpenMDAO Framework Git Repo""\n\n[7]: https://github.com/OpenMDAO/OpenMDAO/issues/new ""Make New OpenMDAO Issue""\n\n[8]: https://help.github.com/articles/changing-a-remote-s-url/ ""Update Git Remote URL""\n\n[10]: 
https://badge.fury.io/py/openmdao.svg ""PyPI Version""\n[11]: https://badge.fury.io/py/openmdao ""OpenMDAO @PyPI""\n\n[12]: https://img.shields.io/pypi/dm/openmdao ""PyPI Monthly Downloads""\n\n[13]: https://coveralls.io/repos/github/OpenMDAO/OpenMDAO/badge.svg?branch=master ""Coverage Badge""\n[14]: https://coveralls.io/github/OpenMDAO/OpenMDAO?branch=master ""OpenMDAO @Coveralls""\n\n[15]: https://en.wikipedia.org/wiki/Software_release_life_cycle#Beta ""Wikipedia Beta""\n\n[16]: https://setuptools.readthedocs.io/en/latest/setuptools.html#development-mode ""Pip Editable Mode""\n\n[17]: https://github.com/OpenMDAO/OpenMDAO/actions/workflows/openmdao_test_workflow.yml/badge.svg ""Github Actions Badge""\n[18]: https://github.com/OpenMDAO/OpenMDAO/actions ""Github Actions""\n\n[19]: http://openmdao.org/newdocs/versions/latest/other_useful_docs/developer_docs/doc_build.html\n\n[20]: https://openmdao.org/newdocs/versions/latest/other/citing.html\n'",,"2016/08/25, 15:53:25",2617,CUSTOM,1406,17261,"2023/10/23, 
14:51:44",63,2173,2992,404,2,2,1.4,0.6629639664047684,,,0,52,false,,false,true,"akshatd/umich-aerosp588,mid2SUPAERO/LCA4MDAO,kanekosh/Dymos_parallel_analysis,h1he2li3/MS-pycycle,dmkeijzer/AetheriaPackage,ovidner/openmdao-catia,ovidner/facit,Felix-Deng/Aircraft-Tire-Selector,jsfriedman/explore-topfarm,whatsopt/WhatsOpt-CLI,tuckerbabcock/E2M2,burn-research/OpenMEASURE,jbussemaker/OpenTurbofanArchitecting,MariusLRuh/lsdo_atmos,MariusLRuh/lift_plus_cruise_weights,JeremyDecroix/pyCycleMDP_PW1133,sherschlock/ISAE-Supaero,Team-Breakfast-Analytica/breakfast,aravindvenkatachalapathy/WIsDEM,albertoprocacci/OpenMEASURE,Scorpion-DX/bag2_test_ff,davidtls/FAST-GA,irfan-gh/FAST-OAD,666FAUST666/Mechatronics_Project_MERCIER-NOURMAMOD,antSGS/FAST-OAD_CS25,rparello/FAST-OAD,KaurCharan/FAST-GA,irfan-gh/FAST-GA,relf/FAST-OAD,marcowijaya/FAST-GA,NOTaShinigami/FAST-GA,marcowijaya/FAST-OAD,SharmaAnubhuti/FAST-GA,Yash-Malandkar/FAST-GA,fomra/FAST-GA,MRn0bod1/FAST-OAD_gui,RB-E/FAST-OAD,florentLutz/FAST-OAD_CS25,davidtls/FAST-OAD_CS25,hugogarcia06/FAST-OAD_CS25,nthepaut/FAST-OAD,esnguyenvan/FAST-OAD,ChaogeCanFly/FAST-OAD,florentLutz/FAST-GA-1,RemyCharayron/FAST-OAD,areysset/FAST-OAD-1,csar-on/FAST-OAD,jnain/FAST-OAD,PeterDezy/FAST-OAD,christophe-david/FAST-OAD,Iamkiller233/FAST-OAD,areysset/FAST-OAD,daptablade/docs,WISDEM/GeneratorSE,DeepFriedDerp/WEIS-DAC_bb_featuredevelopment,joshdowdell1/openconcept,AOE-khkhan/FAST-OAD,niftylab/BAG_framework,daoos/dymos-pre-release,DTUWindEnergy/hydesign,SizingLab/multirotor_sizing_isae_coa_2023_student_version,ElieKadoche/configuration_files,jexalto/ThesisCode,SizingLab/sizing_course,PrincetonUniversity/FAROES,zhemez/ORBIT_NREL,LSDOlab/lsdo_rotor,fast-aircraft-design/FAST-OAD,LSDOlab/ozone,SnowJeffSnow/test.1,SnowJeffSnow/test,nichco/tc1-stability,PlasmaControl/FAROES,Quentief/Rocket,kanekosh/eVTOL_sizing,Creelle/FAST-OAD,Tangxiaotian11/SEM,malmakova-na/HGV_model,tuckerbabcock/MotorModel,dingraha/omjl,mihir0210/WINDOW_static,WISDEM/LTS,rseng/rsepedia-anal
ysis,berlinexpress174/openconcept_winter,OpenMDAO/mphys,ovidner/openmdao-deap,florentLutz/FAST-GA2-MODELS,daptablade/parametric_cgx_model,FredLin0421/4thtubes_ctr,WISDEM/CCBlade,OpenMDAO/zappy,MIT-LAE/pyNA,Atif-Aerospace/OpenAeroStructModels,fast-aircraft-design/FAST-OAD_CS25,HMDomingues/TrackTrajectoryOptimizer,Atif-Aerospace/Aero,Atif-Aerospace/WindTurbine,SizingLab/multirotor_sizing_isae_coa_2022_student_version,astridwalle/doe_visualisation_dashboard,kittyofheaven/Real-Truth-or-Dare,joeyji123/Training,fast-aircraft-design/FAST-OAD_notebooks,SEhrm/sim-comp,kejacobson/om_dash,NREL/WindSE,supaero-aircraft-design/FAST-GA,Bossgb97/CCBlade,sumukhbhoopalam/Covid-19_SAT,LSDOlab/csdl_om,UCSDMorimotoLab/CTRoptimization,anugrahjo/atomics_lite,LSDOlab/omtools,abcdefg781/f1.github.io,Mith15/ECE496_2021,LSDOlab/atomics,jiy352/doc_test,anugrahjo/array_manager,dingraha/CCBladeOpenMDAOExamples,mid2SUPAERO/RP_MAE_VictorGUADANO,ShinsukeSakai0321/Zassou,ShinsukeSakai0321/Image-Web-Apli,WISDEM/Plant_FinanceSE,Rlmuniz/toa,WISDEM/Turbine_CostsSE,NitroCortex/aerostructures,Partmedia/bag,LSDOlab/lsdo_cubesat,vgucsd/cubesat,ovidner/openmdao-nsga,mid2SUPAERO/ecoHALE,JustinSGray/path_dependent_missions,ovidner/openmdao-scop,sdaudlin/BAG_framework,WISDEM/FloatingSE,WISDEM/TowerSE,MAE155B-Group-3-SP20/Group3Repo,ocakgun/BAG_framework,WISDEM/WEIS,jcchin/boring,jcchin/RadialMotorDesign,Kenneth-T-Moore/AMIEGO,git-it/pulse_cam,johnjasa/multifidelity_studies,johnjasa/nrel_openmdao_extensions,fmsnew/nas-bench-nlp-release,Team-Neurons/Covid-Doctor,shubhamgoel90/Covid-Doctor,kushaggarwal/Covid-Doctor,byuflowlab/CCBlade.jl,metamorph-inc/run_mdao,thearn/pandemic,onodip/OpenMDAO-XDSM,WISDEM/ORBIT,naylor-b/om_devtools,bluecheetah/bag,bbrelje/jenkins-github-ci-test,DARcorporation/airfoil-optimizer,DARcorporation/nsde,ucb-art/BAG_framework,DARcorporation/airfoil-optimizer-web-ui,OpenMDAO/pyCycle,ovidner/openmdao-bridge-excel,ovidner/openmdao-bridge-matlab,byuflowlab/OpenMDAO.jl,ovidner/openmdao-omt,metamo
rph-inc/openmdao-csv-driver,metamorph-inc/conductor-mdao,edwardcwang/BAG_framework,WISDEM/WISDEM,byuflowlab/gaussian-wake,whatsopt/openmdao_extensions,William-Metz/OASFORNEURALNETWORKS,mid2SUPAERO/aerostructures,daniel-de-vries/fortran-mdao,DTUWindEnergy/TopFarm2,ovidner/staircase-optigurator,mid2SUPAERO/StaticAeroelasticity-MDO-IACONO,byuflowlab/stanley2018-turbine-design,OpenMDAO/library_template,sairajk/Image-Super-Resolution-Application,mdolab/OpenAeroStruct,OpenMDAO/dymos,mushtaq96/8semproject,TOPFARM-Wind/tutorial_openmdao2,mdolab/openconcept,bbrelje/travis_tutorial,rethore/FUSED-meteor,metamorph-inc/matlab_wrapper,OpenMDAO/flops_wrapper,daniel-de-vries/OpenLEGO,jarrodsinclair/doegen,metamorph-inc/fmu_wrapper,metamorph-inc/bayesopt_openmdao,jcchin/MagnePlane,OpenMDAO/CADRE,byuflowlab/PlantEnergy,OpenMDAO/NRELTraining,naylor-b/aserver",,https://github.com/OpenMDAO,http://www.openmdao.org,,,,https://avatars.githubusercontent.com/u/861615?v=4,,,
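The OpenMDAO README above asks bug reporters to attach system details gathered with `platform` and `sys`. The snippet below is a self-contained, runnable version of that exact procedure; the `system_report` wrapper name is ours for illustration, not an OpenMDAO API.

```python
# Gather the system details the OpenMDAO README asks for in bug reports.
# `system_report` is an illustrative helper name, not part of OpenMDAO.
import platform
import sys

def system_report():
    info = platform.uname()
    return ((info.system, info.version),
            (info.machine, info.processor),
            sys.version)

print(system_report())
```

Pasting the printed tuple into the issue gives maintainers the OS, CPU, and Python build in one line.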
TopFarm2,A Python package developed by DTU Wind Energy to help with wind-farm optimizations.,TOPFARM,,custom,,Wind Energy,,,,,,,,,,https://gitlab.windenergy.dtu.dk/TOPFARM/TopFarm2,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
BasicDTUController,"The scope of this project is to provide an open source, open access controller that can be used by the wind energy community as a reference.",OpenLAC,,custom,,Wind Energy,,,,,,,,,,https://gitlab.windenergy.dtu.dk/OpenLAC/BasicDTUController,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
WindEnergyToolbox,"A collection of Python scripts that facilitate working with (potentially a lot) of HAWC2,HAWCStab2, FAST or other text input based simulation tools.",toolbox,,custom,,Wind Energy,,,,,,,,,,https://gitlab.windenergy.dtu.dk/toolbox/WindEnergyToolbox,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
windfarmGA,Genetic algorithm to optimize the layout of wind farms.,YsoSirius,https://github.com/YsoSirius/windfarmGA.git,github,,Wind Energy,"2023/04/30, 22:11:24",26,0,6,true,R,,,"R,C++",https://ysosirius.github.io/windfarmGA/,"b'# windfarmGA\n\n\n
\n
\n\n\n[](https://www.r-pkg.org/pkg/windfarmGA)\n\n[](https://github.com/YsoSirius/windfarmGA/actions)\n[](https://lifecycle.r-lib.org/articles/stages.html)\n[](https://app.codecov.io/gh/YsoSirius/windfarmGA)\n\n\n\n\nGenetic algorithm to optimize the layout of windfarms.\nThe package is hosted on [CRAN](https://CRAN.R-project.org/package=windfarmGA)\n\n# Installation\nThe latest version can be installed from GitHub with:\n```sh\ndevtools::install_github(""YsoSirius/windfarmGA"")\n```\n\nand the CRAN-version with:\n```sh\ninstall.packages(""windfarmGA"")\n```\n\n# Description\nThe genetic algorithm is designed to optimize wind farms of any shape.\nIt requires a predefined number of turbines, a uniform rotor radius and \nan average wind speed per wind direction.\nIt can include a terrain effect model, which downloads an \n\'SRTM\' elevation model and a \'Corine Land Cover\' raster automatically. The elevation \nmodel is used to find mountains and valleys and to adjust the \nwind speeds accordingly by \'wind multipliers\' and to determine \nthe air densities at rotor heights. The land cover raster with an additional elevation\nroughness value is used to re-evaluate the surface roughness and to individually\ndetermine the wake-decay constant for each turbine.\n\nTo start an optimization, either the function `windfarmGA` or `genetic_algorithm` can \nbe used. The function `windfarmGA` checks the user inputs interactively and then \nruns the function `genetic_algorithm`. If the input parameters are already known, an \noptimization can be run directly via `genetic_algorithm`. \n\n\n
\n
\n
\n\nSince version 1.1, hexagonal grid cells are possible, with \ntheir center points being possible locations for wind turbines. \nFurthermore, rasters can be included, which contain information on the Weibull \nparameters. For Austria this data is already included in the package. \n \n## Create an input Polygon\n- Input Polygon by source\n```sh\nlibrary(sf)\ndsn <- ""Path to the Shapefile""\nlayer <- ""Name of the Shapefile""\nPolygon1 <- sf::st_read(dsn = dsn, layer = layer)\nplot(Polygon1, col = ""blue"")\n```\n\n- Or create a random Polygon\n```sh\nlibrary(sf)\nPolygon1 <- sf::st_as_sf(sf::st_sfc(\n  sf::st_polygon(list(cbind(\n    c(0, 0, 2000, 2000, 0),\n    c(0, 2000, 2000, 0, 0)))),\n  crs = 3035\n))\nplot(Polygon1, col = ""blue"", axes = TRUE)\n```\n\n## Create random Wind data \n- Exemplary input Wind data with *uniform* wind speed and *single* wind direction\n```sh\nwind_df <- data.frame(ws = c(12, 12), wd = c(0, 0), probab = c(25, 25))\nwindrosePlot <- plot_windrose(data = wind_df, spd = wind_df$ws,\n                              dir = wind_df$wd, dirres = 10, spdmax = 20)\n```\n\n- Exemplary input Wind data with *random* wind speeds and *random* wind directions\n```sh\nwind_df <- data.frame(ws = sample(1:25, 10), wd = sample(1:260, 10))\nwindrosePlot <- plot_windrose(data = wind_df, spd = wind_df$ws,\n                              dir = wind_df$wd)\n```\n\n## Grid Spacing\n### Rectangular Grid Cells\nVerify that the grid spacing is appropriate. Adapt the following input variables if necessary:\n- *Rotor*: The rotor radius in meters.\n- *fcrR*: The grid spacing factor, which should be at least 2, so that a single grid covers at least the whole rotor diameter.\n- *prop*: The proportionality factor used for grid calculation. It determines the minimum percentage of the area that a grid cell must cover.\n\n*Make sure that the Polygon is projected in meters.*\n```sh\nRotor <- 20\nfcrR <- 9\nGrid <- grid_area(Polygon1, size = (Rotor * fcrR), prop = 1, plotGrid = TRUE)\nstr(Grid)\n```\n### Hexagonal Grid Cells\n```sh\nRotor <- 20\nfcrR <- 9\nHexGrid <- hexa_area(Polygon1, size = (Rotor * fcrR), plotGrid = TRUE)\nstr(HexGrid)\n```\n\n
\n
\n\n\n## Terrain Effect Model\nIf the input variable **topograp** for the functions `windfarmGA` or `genetic_algorithm` is TRUE, the genetic algorithm will take terrain effects into account. For this purpose an elevation model and a Corine Land Cover raster are downloaded automatically, but can also be given manually. ( [Download a CLC raster](https://www.eea.europa.eu/data-and-maps/data/clc-2006-raster-4) ).\n \n\nIf you want to include your own Land Cover Raster, you must assign the Raster Image path to the input variable **sourceCCL**. The algorithm uses an adapted version of the Raster legend (""clc_legend.csv""), which is stored in the package subdirectory (/extdata). To use own values for the land cover roughness lengths, insert a column named **Rauhigkeit_z** to the .csv file. Assign a surface roughness length to all land cover types. \nBe sure that all rows are filled with numeric values and save the .csv file with "";"" delimiter. Assign the .csv file path to the input variable **sourceCCLRoughness**.\n\n\n## Start an Optimization\nAn optimization run can be initiated with the following functions: \n- genetic_algorithm\n- windfarmGA\n\n### Function calls for windfarmGA\n- without terrain effects\n```sh\nresult <- windfarmGA(Polygon1 = Polygon1, n = 12, Rotor = 20, fcrR = 9, iteration = 10,\n vdirspe = wind_df, crossPart1 = ""EQU"", selstate = ""FIX"", mutr = 0.8,\n Proportionality = 1, SurfaceRoughness = 0.3, topograp = FALSE,\n elitism =TRUE, nelit = 7, trimForce = TRUE,\n referenceHeight = 50, RotorHeight = 100)\n```\n\n- with terrain effects\n```sh\nsourceCCL <- ""Source of the CCL raster (TIF)""\nsourceCCLRoughness <- ""Source of the Adaped CCL legend (CSV)""\n\nresult <- windfarmGA(Polygon1 = Polygon1, n = 12, Rotor = 20, fcrR = 9, iteration = 10,\n vdirspe = wind_df, crossPart1 = ""EQU"", selstate = ""FIX"", mutr = 0.8,\n Proportionality = 1, SurfaceRoughness = 0.3, topograp = TRUE,\n elitism = TRUE, nelit = 7, trimForce = TRUE,\n referenceHeight = 
50, RotorHeight = 100, sourceCCL = sourceCCL,\n sourceCCLRoughness = sourceCCLRoughness)\n```\n\n### Function calls for genetic_algorithm\n- without terrain effects\n```sh\nresult <- genetic_algorithm(Polygon1 = Polygon1, n = 12, Rotor = 20, fcrR = 9, iteration = 10,\n vdirspe = wind_df, crossPart1 = ""EQU"", selstate = ""FIX"", mutr =0.8,\n Proportionality = 1, SurfaceRoughness = 0.3, topograp = FALSE,\n elitism = TRUE, nelit = 7, trimForce = TRUE,\n referenceHeight = 50, RotorHeight = 100)\n```\n\n- with terrain effects\n```sh\nsourceCCL <- ""Source of the CCL raster (TIF)""\nsourceCCLRoughness <- ""Source of the Adaped CCL legend (CSV)""\nresult <- genetic_algorithm(Polygon1 = Polygon1, n= 12, Rotor = 20, fcrR = 9, iteration = 10,\n vdirspe = wind_df, crossPart1 = ""EQU"", selstate = ""FIX"", mutr = 0.8,\n Proportionality = 1, SurfaceRoughness = 0.3, topograp = TRUE,\n elitism = TRUE, nelit = 7, trimForce = TRUE,\n referenceHeight = 50, RotorHeight = 100, sourceCCL = sourceCCL,\n sourceCCLRoughness = sourceCCLRoughness)\n```\n\n```sh\n## Run an optimization with your own Weibull parameter rasters. The shape and scale \n## parameter rasters of the weibull distributions must be added to a list, with the first\n## list item being the shape parameter (k) and the second list item being the scale\n## parameter (a). 
Adapt the paths to your raster data and run an optimization.\nkraster <- ""/..pathto../k_param_raster.tif""\naraster <- ""/..pathto../a_param_raster.tif""\nweibullrasters <- list(raster(kraster), raster(araster))\n\nresult_weibull <- genetic_algorithm(Polygon1 = Polygon1, GridMethod =""h"", n=12,\n fcrR=5, iteration=10, vdirspe = wind_df, crossPart1 = ""EQU"",\n selstate=""FIX"", mutr=0.8, Proportionality = 1, Rotor=30,\n SurfaceRoughness = 0.3, topograp = FALSE,\n elitism=TRUE, nelit = 7, trimForce = TRUE,\n referenceHeight = 50,RotorHeight = 100,\n weibull = TRUE, weibullsrc = weibullrasters)\nplot_windfarmGA(result = result_weibull, Polygon1 = Polygon1)\n```\nThe argument **GridMethod**, **weibull**, **weibullsrc** can also be given to the function `windfarmGA`.\n\n#### Plot the Results on a Leaflet Map\n```sh\n## Plot the best wind farm on a leaflet map (ordered by energy values)\nplot_leaflet(result = resulthex, Polygon1, which = 1)\n\n## Plot the last wind farm (ordered by chronology).\nplot_leaflet(result = resulthex, Polygon1, orderitems = FALSE, which = 1)\n```\n\n## Plotting Methods of the Genetic Algorithm \nSeveral plotting functions are available:\n```sh\n - plot_windfarmGA(result, Polygon1)\n - plot_result(result, Polygon1, best = 1)\n - plot_evolution(result, ask = TRUE, spar = 0.1)\n - plot_development(result)\n - plot_parkfitness(result, spar = 0.1)\n - plot_fitness_evolution(result)\n - plot_cloud(result, pl = TRUE)\n - plot_heatmap(result = result, si = 5)\n - plot_leaflet(result = result, Polygon1 = Polygon1, which = 1)\n```\n\nA full documentation of the genetic algorithm is given in my [master thesis](https://homepage.boku.ac.at/jschmidt/TOOLS/Masterarbeit_Gatscha.pdf).\n\n# Shiny Windfarm Optimization\nI also made a [Shiny App](https://windfarmga.shinyapps.io/windga_shiny/) for the Genetic Algorithm. 
\nUnfortunately, an optimization takes quite some time, and the app is currently hosted on shinyapps.io under a public license with only one R worker, so only one optimization can run at a time. \n\n# Full Optimization example:\n```sh\nlibrary(sf)\nlibrary(windfarmGA)\n\nPolygon1 <- sf::st_as_sf(sf::st_sfc(\n  sf::st_polygon(list(cbind(\n    c(4651704, 4651704, 4654475, 4654475, 4651704),\n    c(2692925, 2694746, 2694746, 2692925, 2692925)))), \n  crs = 3035\n))\nplot(Polygon1, col = ""blue"", axes = TRUE)\n\nwind_df <- data.frame(ws = 12, wd = 0)\nwindrosePlot <- plot_windrose(data = wind_df, spd = wind_df$ws,\n                              dir = wind_df$wd, dirres = 10, spdmax = 20)\nRotor <- 20\nfcrR <- 9\nGrid <- grid_area(shape = Polygon1, size = (Rotor*fcrR), prop = 1, plotGrid = TRUE)\n\nresult <- genetic_algorithm(Polygon1 = Polygon1, \n                            n = 20,\n                            Rotor = Rotor, fcrR = fcrR, \n                            iteration = 50, \n                            vdirspe = wind_df,\n                            referenceHeight = 50, RotorHeight = 100)\n\n# The following function will execute all of the plotting functions below:\nplot_windfarmGA(result, Polygon1, whichPl = ""all"", best = 1, plotEn = 1)\n\n# The plotting functions can also be called individually:\nplot_result(result, Polygon1, best = 1, plotEn = 1, topographie = FALSE)\nplot_evolution(result, ask = TRUE, spar = 0.1)\nplot_parkfitness(result, spar = 0.1)\nplot_fitness_evolution(result)\nplot_cloud(result, pl = TRUE)\nplot_heatmap(result = result, si = 5)\nplot_leaflet(result = result, Polygon1 = Polygon1, which = 1)\n```\n'",,"2017/02/17, 16:56:52",2441,CUSTOM,18,390,"2023/05/02, 07:10:37",4,25,32,6,176,1,0.0,0.4828571428571429,"2021/05/06, 12:45:22",3.0.0,0,2,false,,false,false,,,,,,,,,,,
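The windfarmGA README describes the core loop: a genetic algorithm places a fixed number of turbines on grid-cell centres and evolves layouts via elitism, crossover, and mutation. The sketch below illustrates that idea in Python (the package itself is R); the fitness here is simply the minimum pairwise turbine spacing, a stand-in for the package's energy-based fitness, and all function names are ours, not windfarmGA's.

```python
# Minimal Python sketch of a genetic layout optimizer, loosely mirroring the
# windfarmGA workflow (grid cells, elitism, crossover, mutation). Fitness is
# minimum pairwise spacing, a simplified stand-in for energy yield.
import itertools
import random

def fitness(layout, cells):
    """Minimum pairwise distance between the chosen cell centres."""
    pts = [cells[i] for i in layout]
    return min(((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
               for a, b in itertools.combinations(pts, 2))

def evolve(cells, n_turbines=4, pop_size=20, generations=30, seed=0):
    rng = random.Random(seed)
    pop = [rng.sample(range(len(cells)), n_turbines) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(ind, cells), reverse=True)
        elite = pop[:pop_size // 2]                       # elitism
        children = []
        for _ in range(pop_size - len(elite)):
            a, b = rng.sample(elite, 2)
            child = list(dict.fromkeys(a + b))[:n_turbines]   # crossover
            while len(child) < n_turbines:                    # repair
                c = rng.randrange(len(cells))
                if c not in child:
                    child.append(c)
            if rng.random() < 0.8:                            # mutation
                free = [c for c in range(len(cells)) if c not in child]
                child[rng.randrange(n_turbines)] = rng.choice(free)
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda ind: fitness(ind, cells))

# 5x5 grid of cell centres, 100 m apart (analogous to grid_area output)
cells = [(x * 100.0, y * 100.0) for x in range(5) for y in range(5)]
best = evolve(cells)
```

A real run would replace `fitness` with a wake-aware energy model, which is exactly what the package's `genetic_algorithm` does internally.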
wtphm,"The Wind Turbine Prognostics and Health Management library processes wind turbine events data, as well as operational SCADA data for easier fault detection, prognostics or reliability research.",lkev,https://github.com/lkev/wtphm.git,github,"wind-turbine,wind-energy,fault-detection,machine-learning,scada",Wind Energy,"2021/01/07, 17:46:26",53,0,12,false,Python,,,Python,,"b'.. comment\n\nWTPHM\n*****\n\nThe **W**\\ind **T**\\urbine **P**\\rognostics and **H**\\ealth **M**\\anagement library\nprocesses wind turbine events (also called alarms or status) data, as well as\noperational SCADA data (the usually 10-minute data coming off of wind turbines)\nfor easier fault detection, prognostics or reliability research.\n\nTurbine alarms often appear in high numbers during fault events, and significant\neffort can be involved in processing these alarms in order to find what actually\nhappened, what the root cause was, and when the turbine came back online.\nThis module addresses this by automatically identifying stoppages and fault periods\nin the data and assigning a high-level ""stoppage category"" to each.\nIt also provides functionality to use this info to label SCADA data for training\npredictive maintenance algorithms.\n\nAlthough there are commercial packages that can perform this task, this library\naims to be an open-source alternative for use by the research community.\n\nPlease reference this repo if used in any research. Any bugs, questions or\nfeature requests can be raised on GitHub. You can also reach me on Twitter\n@leahykev.\n\nThis library was used to build the ""batch creation"" and ""data labelling"" steps of `this paper `_.\n\nInstallation\n============\n\nInstall using pip! 
::\n\n pip install wtphm\n\nDocumentation\n=============\n\nFull documentation and user guide can be found on\n`readthedocs `_.\n\nA local copy of the docs can\nbe built by running ``_ with sphinx installed.\n\nIs my Data Compatible?\n======================\n\nThe data manipulated in this library are turbine events/status/alarms data and\n10-minute operational SCADA data.\nThey must be in the formats described below.\n\nEvent Data\n----------\n\n.. start event comment\n\nThe ``event_data`` is related to any fault or information messages generated by\nthe turbine. This is instantaneous, and records information like faults that have\noccurred, or status messages like low- or no- wind, or turbine shutting down due\nto storm winds.\n\nThe data must have the following column headers and information available:\n\n* ``turbine_num``: The turbine the data applies to\n* ``code``: There are a set list of events which can occur on the\n turbine. Each one of these has an event code\n* ``description``: Each event code also has an associated description\n* ``time_on``: The start time of the event\n* ``stop_cat``: This is a category for events which cause the turbine to come to\n a stop. It could be the functional location of where in the turbine the event\n originated (e.g. pitch system), a category for grid-related events,\n that the turbine is down for testing or maintenance, in curtailment due to\n shadow flicker, etc.\n* In addition, there must be a specific event ``code`` which signifies return to\n normal operation after any downtime or abnormal operating period.\n\n.. end event comment\n\nSCADA/Operational data\n----------------------\n\n.. start scada comment\n\nThe ``scada_data`` is typically recorded in 10-minute intervals and has attributes like\naverage power output, maximum, minimum and average windspeeds, etc. 
over the previous\n10-minute period.\n\nFor the purposes of this library, it must have the following column headers and\ndata:\n\n* ``turbine_num``: The turbine the data applies to\n* ``time``: The 10-minute period the data belongs to\n* availability counters: Some of the functions for giving the batches a stop\n category rely on availability counters. These are sometimes stored as part of\n scada data, and sometimes in separate availability data. They count the portion\n of time the turbine was in some mode of operation in each 10-minute period,\n for availability calculations. For example, maintenance time, fault time, etc.\n In order to be used in this library, the availability counters are\n assumed to range between 0 and\n *n* in each period, where *n* is some arbitrary maximum (typically 600, for\n the 600 seconds in the 10-minute period).\n\n.. end scada comment\n'",,"2018/08/22, 18:39:45",1890,GPL-3.0,0,70,"2020/06/20, 14:58:49",2,1,3,0,1222,0,0.0,0.0,,,0,1,false,,false,false,,,,,,,,,,,
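The wtphm README lists the column headers that event data must contain before the library can process it. A small pre-flight check (not part of wtphm; the helper name and sample record are illustrative) can confirm a record set is compatible:

```python
# Check that event records carry the columns the wtphm README requires.
# REQUIRED_EVENT_COLUMNS follows the README's "Event Data" section;
# check_event_columns is an illustrative helper, not a wtphm function.
REQUIRED_EVENT_COLUMNS = {"turbine_num", "code", "description",
                          "time_on", "stop_cat"}

def check_event_columns(records):
    """Return the set of required columns missing from the first record."""
    if not records:
        return set(REQUIRED_EVENT_COLUMNS)
    return REQUIRED_EVENT_COLUMNS - set(records[0])

events = [{"turbine_num": 1, "code": 900, "description": "pitch fault",
           "time_on": "2020-01-01 00:00:00", "stop_cat": "pitch"}]
print(check_event_columns(events))
```

An empty result means the data matches the expected headers; any names printed are the columns still to be supplied.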
AirfoilPreppy,"A Python module for pre-processing and evaluating aerodynamic airfoil data, primarily for wind turbine applications.",WISDEM,https://github.com/WISDEM/AirfoilPreppy.git,github,,Wind Energy,"2023/10/22, 02:40:21",31,0,4,true,Python,WISDEM,WISDEM,Python,,"b'# Airfoil Preppy\n\nA Python module for preprocessing and evaluating aerodynamic airfoil data---primarily for wind turbine applications.\n\nAuthor: [NREL WISDEM Team](mailto:systems.engineering@nrel.gov) \n\n## Documentation\n\nFor detailed documentation see \n\n## Installation\n\nFor detailed installation instructions of WISDEM modules see or to install AirfoilPreppy by itself do:\n\n $ python setup.py install\n\n## Unit Tests\n\nTo check for successful installation, run the unit tests\n\n $ python test/test_airfoilprep.py\n\nFor software issues please use . For functionality and theory related questions and comments please use the NWTC forum for [Systems Engineering Software Questions](https://wind.nrel.gov/forum/wind/viewtopic.php?f=34&t=1002).\n\n'",,"2013/12/13, 22:21:11",3602,CUSTOM,1,44,"2020/06/11, 20:16:59",1,10,19,0,1230,0,0.0,0.6470588235294117,,,0,7,false,,false,false,,,https://github.com/WISDEM,https://www.nrel.gov/wind/systems-engineering.html,"NREL National Wind Technology Center, Boulder, CO",,,https://avatars.githubusercontent.com/u/5444272?v=4,,,
GreenGuard,A collection of end-to-end solutions for machine learning problems commonly found in monitoring wind energy production system.,signals-dev,https://github.com/sintel-dev/Draco.git,github,"classification,machine-learning,time-series",Wind Energy,"2023/07/31, 15:36:01",49,0,13,true,Jupyter Notebook,The Signal Intelligence Project,sintel-dev,"Jupyter Notebook,Python,Makefile,Dockerfile",https://sintel-dev.github.io/Draco,"b'\n
\nAn open source project from Data to AI Lab at MIT.\n
\n\n\n
\n
\n\n\nAutoML for Time Series.\n
\n\n\n[](https://pypi.python.org/pypi/draco-ml)\n[](https://github.com/sintel-dev/Draco/actions?query=workflow%3A%22Run+Tests%22+branch%3Amaster)\n[](https://pepy.tech/project/draco-ml)\n[](https://mybinder.org/v2/gh/sintel-dev/Draco/master?filepath=tutorials)\n\n\n# Draco\n\n- License: [MIT](https://github.com/sintel-dev/Draco/blob/master/LICENSE)\n- Documentation: https://sintel-dev.github.io/Draco\n- Homepage: https://github.com/sintel-dev/Draco\n\n## Overview\n\nThe Draco project is a collection of end-to-end solutions for machine learning problems\ncommonly found in time series monitoring systems. Most tasks utilize sensor data\nemanating from monitoring systems. We utilize the foundational innovations developed for\nautomation of machine Learning at Data to AI Lab at MIT.\n\nThe salient aspects of this customized project are:\n\n* A set of ready to use, well tested pipelines for different machine learning tasks. These are\n vetted through testing across multiple publicly available datasets for the same task.\n* An easy interface to specify the task, pipeline, and generate results and summarize them.\n* A production ready, deployable pipeline.\n* An easy interface to ``tune`` pipelines using Bayesian Tuning and Bandits library.\n* A community oriented infrastructure to incorporate new pipelines.\n* A robust continuous integration and testing infrastructure.\n* A ``learning database`` recording all past outcomes --> tasks, pipelines, outcomes.\n\n## Resources\n\n* [Data Format](DATA_FORMAT.md).\n* [Draco folder structure](DATA_FORMAT.md#folder-structure).\n\n# Install\n\n## Requirements\n\n**Draco** has been developed and runs on Python 3.6, 3.7 and 3.8.\n\nAlso, although it is not strictly required, the usage of a [virtualenv](\nhttps://virtualenv.pypa.io/en/latest/) is highly recommended in order to avoid interfering\nwith other software installed in the system where you are trying to run **Draco**.\n\n## Download and Install\n\n**Draco** can be installed 
locally using [pip](https://pip.pypa.io/en/stable/) with\nthe following command:\n\n```bash\npip install draco-ml\n```\n\nThis will pull and install the latest stable release from [PyPi](https://pypi.org/).\n\nIf you want to install from source or contribute to the project please read the\n[Contributing Guide](https://sintel-dev.github.io/Draco/contributing.html#get-started).\n\n# Data Format\n\nThe minimum input expected by the **Draco** system consists of the following two elements,\nwhich need to be passed as `pandas.DataFrame` objects:\n\n## Target Times\n\nA table containing the specification of the problem that we are solving, which has three\ncolumns:\n\n* `turbine_id`: Unique identifier of the turbine which this label corresponds to.\n* `cutoff_time`: Time associated with this target\n* `target`: The value that we want to predict. This can either be a numerical value or a\n categorical label. This column can also be skipped when preparing data that will be used\n only to make predictions and not to fit any pipeline.\n\n| | turbine_id | cutoff_time | target |\n|----|--------------|---------------------|----------|\n| 0 | T1 | 2001-01-02 00:00:00 | 0 |\n| 1 | T1 | 2001-01-03 00:00:00 | 1 |\n| 2 | T2 | 2001-01-04 00:00:00 | 0 |\n\n## Readings\n\nA table containing the signal data from the different sensors, with the following columns:\n\n * `turbine_id`: Unique identifier of the turbine which this reading comes from.\n * `signal_id`: Unique identifier of the signal which this reading comes from.\n * `timestamp (datetime)`: Time where the reading took place, as a datetime.\n * `value (float)`: Numeric value of this reading.\n\n| | turbine_id | signal_id | timestamp | value |\n|----|--------------|-------------|---------------------|---------|\n| 0 | T1 | S1 | 2001-01-01 00:00:00 | 1 |\n| 1 | T1 | S1 | 2001-01-01 12:00:00 | 2 |\n| 2 | T1 | S1 | 2001-01-02 00:00:00 | 3 |\n| 3 | T1 | S1 | 2001-01-02 12:00:00 | 4 |\n| 4 | T1 | S1 | 2001-01-03 00:00:00 | 5 |\n| 5 | 
T1           | S1          | 2001-01-03 12:00:00 |       6 |\n|  6 | T1           | S2          | 2001-01-01 00:00:00 |       7 |\n|  7 | T1           | S2          | 2001-01-01 12:00:00 |       8 |\n|  8 | T1           | S2          | 2001-01-02 00:00:00 |       9 |\n|  9 | T1           | S2          | 2001-01-02 12:00:00 |      10 |\n| 10 | T1           | S2          | 2001-01-03 00:00:00 |      11 |\n| 11 | T1           | S2          | 2001-01-03 12:00:00 |      12 |\n\n## Turbines\n\nOptionally, a third table can be added containing metadata about the turbines.\nThe only requirement for this table is to have a `turbine_id` field, and it can have\nan arbitrary number of additional fields.\n\n|    | turbine_id   | manufacturer   | ...   | ...   | ...   |\n|----|--------------|----------------|-------|-------|-------|\n|  0 | T1           | Siemens        | ...   | ...   | ...   |\n|  1 | T2           | Siemens        | ...   | ...   | ...   |\n\n## CSV Format\n\nApart from the in-memory data format explained above, which is limited by the memory\nallocation capabilities of the system where it is run, **Draco** is also prepared to\nload and work with data stored as a collection of CSV files, drastically increasing the amount\nof data it can work with. Further details about this format can be found in the\n[project documentation site](DATA_FORMAT.md#csv-format).\n\n# Quickstart\n\nIn this example we will load some demo data and classify it using a **Draco Pipeline**.\n\n## 1. Load and split the demo data\n\nThe first step is to load the demo data.\n\nFor this, we will import and call the `draco.demo.load_demo` function without any arguments:\n\n```python3\nfrom draco.demo import load_demo\n\ntarget_times, readings = load_demo()\n```\n\nThe returned objects are:\n\n* ``target_times``: A ``pandas.DataFrame`` with the ``target_times`` table data:\n\n  ```\n    turbine_id cutoff_time  target\n  0       T001  2013-01-12       0\n  1       T001  2013-01-13       0\n  2       T001  2013-01-14       0\n  3       T001  2013-01-15       1\n  4       T001  2013-01-16       0\n  ```\n\n* ``readings``: A ``pandas.DataFrame`` containing the time series data in the format explained above.\n\n  ```\n    turbine_id signal_id  timestamp  value\n  0       T001       S01 2013-01-10  323.0\n  1       T001       S02 2013-01-10  320.0\n  2       T001       S03 2013-01-10  284.0\n  3       T001       S04 2013-01-10  348.0\n  4       T001       S05 2013-01-10  273.0\n  ```\n\nOnce we have loaded the `target_times` and before proceeding to training any Machine Learning\nPipeline, we will split them into 2 partitions for training and testing.\n\nIn this case, we will split them using the [train_test_split function from scikit-learn](\nhttps://scikit-learn.org/stable/modules/generated/sklearn.model_selection.train_test_split.html),\nbut it can be done with any other suitable tool.\n\n```python3\nfrom sklearn.model_selection import train_test_split\n\ntrain, test = train_test_split(target_times, test_size=0.25, random_state=0)\n```\n\nNotice how we are only splitting the `target_times` data and not the `readings`.\nThis is because the pipelines will later on take care of selecting the parts of the\n`readings` table needed for the training based on the information found inside\nthe `train` and `test` inputs.\n\nAdditionally, if we want to calculate a goodness-of-fit score later on, we can separate the\ntesting target values from the `test` table by popping them from it:\n\n```python3\ntest_targets = test.pop(\'target\')\n```\n\n## 2. Exploring the available Pipelines\n\nOnce we have the data ready, we need to find a suitable pipeline.\n\nThe list of available Draco Pipelines can be obtained using the `draco.get_pipelines`\nfunction.\n\n```python3\nfrom draco import get_pipelines\n\npipelines = get_pipelines()\n```\n\nThe returned `pipelines` variable will be a `list` containing the names of all the pipelines\navailable in the Draco system:\n\n```\n[\'lstm\',\n \'lstm_with_unstack\',\n \'double_lstm\',\n \'double_lstm_with_unstack\']\n```\n\nFor the rest of this tutorial, we will select and use the pipeline\n`lstm_with_unstack` as our template.\n\n```python3\npipeline_name = \'lstm_with_unstack\'\n```\n\n## 3. Fitting the Pipeline\n\nOnce we have loaded the data and selected the pipeline that we will use, we have to\nfit it.\n\nFor this, we will create an instance of a `DracoPipeline` object passing the name\nof the pipeline that we want to use:\n\n```python3\nfrom draco.pipeline import DracoPipeline\n\npipeline = DracoPipeline(pipeline_name)\n```\n\nAnd then we can directly fit it to our data by calling its `fit` method and passing in the\ntraining `target_times` and the complete `readings` table:\n\n```python3\npipeline.fit(train, readings)\n```\n\n## 4. Make predictions\n\nAfter fitting the pipeline, we are ready to make predictions on new data by calling the\n`pipeline.predict` method passing the testing `target_times` and, again, the complete\n`readings` table.\n\n```python3\npredictions = pipeline.predict(test, readings)\n```\n\n## 5. 
Evaluate the goodness-of-fit\n\nFinally, after making predictions we can evaluate how good the prediction was\nusing any suitable metric.\n\n```python3\nfrom sklearn.metrics import f1_score\n\nf1_score(test_targets, predictions)\n```\n\n## What\'s next?\n\nFor more details about **Draco** and all its possibilities and features, please check the\n[project documentation site](https://sintel-dev.github.io/Draco/)\nAlso do not forget to have a look at the [tutorials](\nhttps://github.com/sintel-dev/Draco/tree/master/tutorials)!\n'",,"2018/09/27, 14:50:42",1854,MIT,11,294,"2023/07/24, 12:12:04",6,54,71,7,93,0,0.8,0.5914893617021277,"2023/07/31, 16:02:44",v0.3.0,0,5,false,,false,true,,,https://github.com/sintel-dev,https://dai.lids.mit.edu/,,,,https://avatars.githubusercontent.com/u/13336772?v=4,,,
pyconturb,Constrained Stochastic Turbulence for Wind Energy Applications.,pyconturb,,custom,,Wind Energy,,,,,,,,,,https://gitlab.windenergy.dtu.dk/pyconturb/pyconturb,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
ORBIT,Offshore Renewable Balance-of-system Installation Tool computes capital costs and activity times for offshore wind plant balance-of-system (everything besides the turbine) costs.,WISDEM,https://github.com/WISDEM/ORBIT.git,github,,Wind Energy,"2023/04/21, 15:30:22",14,0,5,true,Python,WISDEM,WISDEM,Python,https://wisdem.github.io/ORBIT/,"b'ORBIT\n=====\n\nOffshore Renewables Balance of system and Installation Tool\n\n\n:Version: 1.0.8\n:Authors: `Jake Nunemaker `_, `Matt Shields `_, `Rob Hammond `_\n:Documentation: `ORBIT Docs `_\n\nInstallation\n------------\n\nAs of version 0.5.2, ORBIT is now pip installable with ``pip install orbit-nrel``.\n\nDevelopment Setup\n-----------------\n\nThe steps below are for more advanced users who would like to modify\nand contribute to ORBIT.\n\nEnvironment\n~~~~~~~~~~~\n\nA couple of notes before you get started:\n - It is assumed that you will be using the terminal on MacOS/Linux or the\n Anaconda Prompt on Windows. The instructions refer to both as the\n ""terminal"", and unless otherwise noted the commands will be the same.\n - To verify git is installed, run ``git --version`` in the terminal. If an error\n occurs, install git using these `directions `_.\n - The listed installation process is intended to be the easiest for any OS\n to get started. An alternative setup that doesn\'t rely on Anaconda for\n setting up an environment can be followed\n `here `_.\n\nInstructions\n~~~~~~~~~~~~\n\n1. Download the latest version of `Miniconda `_\n for the appropriate OS. Follow the remaining `steps `_\n for the appropriate OS version.\n2. From the terminal, install pip by running: ``conda install -c anaconda pip``\n3. Next, create a new environment for the project with the following.\n\n .. code-block:: console\n\n conda create -n python=3.7 --no-default-packages\n\n To activate/deactivate the environment, use the following commands.\n\n .. code-block:: console\n\n conda activate \n conda deactivate \n\n4. 
Clone the repository:\n ``git clone https://github.com/WISDEM/ORBIT.git``\n5. Navigate to the top level of the repository\n (``/ORBIT/``) and install ORBIT as an editable package\n with the following command.\n\n .. code-block:: console\n\n # Note the ""."" at the end\n pip install -e .\n\n # OR if you are going to be contributing to the code or building documentation\n pip install -e \'.[dev]\'\n6. (Development only) Install the pre-commit hooks to autoformat code and\n check that tests pass.\n\n .. code-block:: console\n\n pre-commit install\n\nDependencies\n~~~~~~~~~~~~\n\n- Python 3.7+\n- marmot-agents\n- NumPy\n- SciPy\n- Matplotlib\n- OpenMDAO (>=3.2)\n\nDevelopment Specific\n~~~~~~~~~~~~~~~~~~~~\n\n- black\n- isort\n- pre-commit\n- pytest\n- sphinx\n- sphinx-rtd-theme\n\n\nRecommended packages for easy iteration and running of code:\n~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n\n- jupyterlab\n- pandas\n'",,"2019/12/09, 19:05:10",1416,Apache-2.0,62,632,"2023/04/04, 16:03:22",7,96,129,20,204,0,0.6,0.14589665653495443,"2020/07/01, 16:36:38",v0.5.0,0,6,false,,false,false,,,https://github.com/WISDEM,https://www.nrel.gov/wind/systems-engineering.html,"NREL National Wind Technology Center, Boulder, CO",,,https://avatars.githubusercontent.com/u/5444272?v=4,,,
WindTurbineClassification,Specification of 'normal' wind turbine operating behaviour for rapid anomaly detection.,nmstreethran,https://github.com/nmstreethran/WindTurbineClassification.git,github,"energy,renewable-energy,python,jupyter-notebook,wind-turbines",Wind Energy,"2023/10/04, 18:31:41",49,0,11,true,Python,,,Python,,"b""# WindTurbineClassification\n\n[](https://doi.org/10.5281/zenodo.2875795)\n[](https://raw.githubusercontent.com/nmstreethran/WindTurbineClassification/current/docs/nms_dissertation.pdf)\n\n\n\n***Specification of 'normal' wind turbine operating behaviour for rapid anomaly detection: through the use of machine learning algorithms***\n\nby Nithiya Streethran (nmstreethran@gmail.com)\n\nThis work is derived from my dissertation for the degree of Master of Science (MSc) in Renewable Energy Engineering at Heriot-Watt University, which was completed during a technical placement at Natural Power between May and August 2017.\n\n**Unfortunately, the datasets are proprietary industry data and I do not own the rights to distribute them to the public. Please do not contact me to request the datasets.**\n\n## Abstract\n\nMaximising the economic effectiveness of a wind farm is essential in making wind a more economic source of energy. This effectiveness can be increased through the reduction of operation and maintenance costs, which can be achieved through continuously monitoring the condition of wind turbines. An alternative to expensive condition monitoring systems, which can be uneconomical especially for older wind turbines, is to implement classification algorithms on supervisory control and data acquisition (SCADA) signals, which are collected in most wind turbines. Several publications were reviewed, which were all found to use separate algorithms to predict specific faults in advance. In reality, wind turbines tend to have multiple faults which may happen simultaneously and have correlations with one another. 
This project focusses on developing a methodology to predict multiple wind turbine faults in advance simultaneously by implementing classification algorithms on SCADA signals for a wind farm with 25 turbines rated at 2,500 kW, spanning a period of 30 months. The data, which included measurements of wind speed, active power and pitch angle, was labelled using corresponding downtime data to detect normal behaviour, faults and varying timescales before a fault occurs. Three different classification algorithms, namely decision trees, random forests and k nearest neighbours were tested using imbalanced and balanced training data, initially to optimise a number of hyperparameters. The random forest classifier produced the best results. Upon conducting a more detailed analysis on the performance of specific faults, it was found that the classifier was unable to detect the varying timescales before a fault with accuracy comparable to that of normal or faulty behaviour. This could have been due to the SCADA data, which are used as features, being unsuitable for detecting the faults, and there is potential to improve this by balancing only these classes.\n\n***Keywords:*** wind turbine, classification algorithm, SCADA, fault detection, condition monitoring\n\n## Scripts\n\n | Source | Output\n------ | -- | --\nProcess SCADA and downtime data | [](scripts/process_data.py) | [](https://nbviewer.org/github/nmstreethran/WindTurbineClassification/blob/current/docs/jupyter-notebooks/process_data.ipynb)\nDowntime categories | [](scripts/downtime_categories.py) | [](https://nbviewer.org/github/nmstreethran/WindTurbineClassification/blob/current/docs/jupyter-notebooks/downtime_categories.ipynb)\nMerge SCADA and downtime data | [](scripts/SCADA_downtime_merge.py)\nPower curves for all turbines | | [](https://nbviewer.org/github/nmstreethran/WindTurbineClassification/blob/current/docs/jupyter-notebooks/powercurves_all.ipynb)\n\n## License\n\nUnless otherwise stated:\n\n- Code and 
scripts are licensed under the [MIT License](https://opensource.org/licenses/MIT).\n- Content, images, and documentation are licensed under a [Creative Commons Attribution 4.0 International (CC-BY-4.0) License](https://creativecommons.org/licenses/by/4.0/).\n\nProject badges are generated using [Shields.io](https://shields.io/) and [Simple Icons](https://simpleicons.org/).\n""",",https://doi.org/10.5281/zenodo.2875795","2018/07/29, 23:36:18",1913,MIT,1,97,"2022/04/10, 16:19:44",0,1,1,0,563,0,0.0,0.0,"2020/11/27, 17:33:37",v1.1.0,0,1,false,,false,false,,,,,,,,,,,
ANYstructure,Offshore Steel structure calculation tool with automatic optimization and report generation.,audunarn,https://github.com/audunarn/ANYstructure.git,github,"dnvgl-os-c101,design-of-offshore-steel-structures,buckling,fatigue-analysis,analysis-framework,plate-thickness,beam-section,gui-based,dnvgl-rp-c201,optimization-tools,structural-engineering,naval-architecture,dnvgl-rp-c203,cylinders,plates,girder,dnv-os-c101",Wind Energy,"2023/10/23, 19:43:00",36,1,11,true,Python,,,"Python,JavaScript",,"b'# ANYstructure #\nANYstructure is the ultimate steel structure design tool for plate fields and cylinders! \nWeight optimization for all structures with machine learning capabilities. \nCalculations are based on DNV standards and rules.\n### What\'s new in 4.10 ###\n* Corrected minor bug on membrane stresses for unstiffened cylinder.\n### What\'s new in 4.9.1 ###\n* Corrected bug in loading old save files.\n* Corrected error on buckling flat plate calculation.\n### What\'s new in 4.8 ###\n* Reporting table on cylinders.\n* Color coding on some cylinder properties.\n* Corrected error on additional hoop stress input for cylinders.\n### What\'s new in 4.7 ###\n* Corrected error on girder calculation for cylinder buckling.\n* Added 1.10 load factor option for cylinder buckling.\n* Better compatibility with Linux.\n* Python 3.11 based.\n### What\'s new in 4.4 ###\n* Backup and restore feature added.\n### What\'s new in 4.3 ###\n* General stability.\n* User friendliness.\n### What\'s new in 4.2 ###\n* Bug fixing.\n* Ukrainian theme.\n### What\'s new in 4.0 ###\n* Cylinder design and optimization!\n* Flat plate prescriptive buckling improved. Girder calculation added.\n* Updated GUI with color themes.\n### What\'s new in 3.3 ###\n* Extremely efficient Machine Learning version of PULS called ML-CL. 
Implemented for all optimizer options.\n* Calculation of Center of Gravity and Center of Buoyancy.\n* Reporting of weights and COG.\n* Lots of bug fixes.\n\n------------------------------------------------------------------------\n\n## The following is calculated: ##\n* Minimum plate thickness (DNV-OS-C101)\n* Minimum section modulus of stiffener/plate (DNVGL-OS-C101)\n* Minimum shear area (DNVGL-OS-C101)\n* Buckling (DNVGL-RP-C201)or PULS (licenced DNV software)\n* Buckling strength of shells DNV-RP-C202\n* PULS buckling (DNV license needed)\n* Machine learning buckling, PULS based\n* Fatigue for plate/stiffener connection (DNVGL-RP-C203)\n\nCompartments (tank pressures) are created automatically.\n\nPressures on external hull (or any other generic location) is defined by specifying equations.\n\nYou can optimize cylinders, single plate/stiffener field or multiple. Geometry of double bottom can be optimized.\n\nPLEASE CONTRIBUTE. REPORT BUGS ERRORS ETC.\nFor windows executable (.exe) version for non-coders, use the link below.\n\nFeedback: audunarn@gmail.com or discuss on github.\n\nPlease like, share or comment on LinkedIn: https://www.linkedin.com/in/audun-arnesen-nyhus-6aa17118/\n\nScreenshot (this example can be loaded from file ""ship_section_example.txt""):\n\n\n'",,"2018/04/10, 09:10:37",2024,MIT,48,834,"2023/08/30, 07:14:49",5,100,122,10,56,0,0.1,0.0028169014084507005,"2023/08/30, 07:53:17",4.10,0,3,false,,false,false,narest-qa/repo14,,,,,,,,,,
windrose,A graphic tool used by meteorologists to give a succinct view of how wind speed and direction are typically distributed at a particular location.,python-windrose,https://github.com/python-windrose/windrose.git,github,"python,matplotlib,windrose,wind,speed,pandas,numpy",Wind Energy,"2023/10/03, 09:40:55",315,128,43,true,Python,,python-windrose,"Python,Jupyter Notebook,TeX",https://python-windrose.github.io/windrose,"b'[](https://pypi.python.org/pypi/windrose/)\n[](https://pypi.python.org/pypi/windrose/)\n[](https://pypi.python.org/pypi/windrose/)\n[](https://pypi.python.org/pypi/windrose/)\n[](https://pypi.python.org/pypi/windrose/)\n[](https://github.com/python-windrose/windrose/actions/workflows/tests.yml)\n[](https://zenodo.org/badge/latestdoi/37549137)\n[](https://joss.theoj.org/papers/10.21105/joss.00268)\n\n# Windrose\n\nA [wind rose](https://en.wikipedia.org/wiki/Wind_rose) is a graphic tool used by meteorologists to give a succinct view of how wind speed and direction are typically distributed at a particular location. It can also be used to describe air quality pollution sources. The wind rose tool uses Matplotlib as a backend. Data can be passed to the package using Numpy arrays or a Pandas DataFrame.\n\nWindrose is a Python library to manage wind data, draw windroses (also known as polar rose plots), and fit Weibull probability density functions.\n\nThe initial use case of this library was for a technical report concerning pollution exposure and wind distributions analyzes. Data from local pollution measures and meteorologic information from various sources like Meteo-France were used to generate a pollution source wind rose.\n\nIt is also used by some contributors for teaching purpose.\n\n\n\nSome others contributors have used it to make figures for a [wind power plant control optimization study](https://www.nrel.gov/docs/fy17osti/68185.pdf).\n\nSome academics use it to track lightning strikes during high intensity storms. 
They are using it to visualize the motion of storms based on the relative position of the lightning from one strike to the next.\n\n## Try windrose on mybinder.org\n\n[](https://mybinder.org/v2/gh/python-windrose/windrose/HEAD?labpath=notebooks)\n\n## Install\n\n### Requirements\n\n- matplotlib http://matplotlib.org/\n- numpy http://www.numpy.org/\n- and naturally python https://www.python.org/ :-P\n\nOptional libraries:\n\n- Pandas http://pandas.pydata.org/ (to feed plot functions easily)\n- Scipy http://www.scipy.org/ (to fit data with Weibull distribution)\n- ffmpeg https://www.ffmpeg.org/ (to output video)\n- click http://click.pocoo.org/ (for command line interface tools)\n- seaborn https://seaborn.pydata.org/ (for easy subplots)\n\n### Install latest release version via pip\n\nA package is available and can be downloaded from PyPi and installed using:\n\n```bash\n$ pip install windrose\n```\n\n### Install latest development version\n\n```bash\n$ pip install git+https://github.com/python-windrose/windrose\n```\n\nor\n\n```bash\n$ git clone https://github.com/python-windrose/windrose\n$ python setup.py install\n```\n\n## Documentation\nFull documentation of library is available at https://python-windrose.github.io/windrose/\n\n## Community guidelines\n\nYou can help to develop this library.\n\n### Code of Conduct\n\nIf you are using Python Windrose and want to interact with developers, others users...\nwe encourage you to follow our [code of conduct](https://github.com/python-windrose/windrose/blob/master/CODE_OF_CONDUCT.md).\n\n### Contributing\n\nIf you discover issues, have ideas for improvements or new features, please report them.\n[CONTRIBUTING.md](https://github.com/python-windrose/windrose/blob/master/CONTRIBUTING.md) explains\nhow to contribute to this project.\n\n### List of contributors and/or notable users\nhttps://github.com/python-windrose/windrose/blob/main/CONTRIBUTORS.md\n'",",https://zenodo.org/badge/latestdoi/37549137","2015/06/16, 
18:42:14",3053,CUSTOM,86,391,"2023/10/03, 09:40:59",23,120,233,47,22,2,0.0,0.3980891719745223,"2023/06/12, 17:45:05",v1.9.0,0,19,false,,true,true,"Alexsaez1990/proyecto_cultivos,Maxbeal/noisemodel,cathrinr/driving_change,imorinigo/dashboard_space_apps,MET-OM/metocean-stats,BahadoriMohammad/PyFluxPro,aesirkth/euroc23_electronics,adrnfk19/SoundingRocketDesign,Huell-Howitzer/docker-stacks,calstar/LE2,jisoo-j/RocketPy,JayMangukiya1614/TempDriverSchedule,OpenFDEM-geomechanics/Post-processing,os-simopt/wrfplotter,Kostyak7/Avellon_tech,CarstenOtl/WESP_RocketPy,miky21121996/HFR-project,RocketPy-Team/Infinity-API,juliet29/windows,edmundhong/formula1-dataanalysis,vieirasaulo/wasserweise,bavodenys/kitesurf_AI,sia-information-system/siaplotlib,Serg-NSD/SkillFactory-Data_Science,miky21121996/MO_project,rcushen/wind-turbines,camirmas/REStats,danilo-pilacuan/TechnicalEvaluationExchangeAPI,AnujaDassanayake/test-deployment-1,devSmarak972/Chakrasindhu-portal,jhaalbu/av_klima_nve,derekeden/aisio,Sakura-echos/CH4-caculate,robintw/PyAURN,SINTEF/blues-metocean-lib-examples,jhaalbu/klima_app,umweltschutzsoftware/windrose,jhaalbu/klima_docs,GEUS-Glaciology-and-Climate/GC-Net-evaluation,iliketoast-create/rocket,kaustuvchatterjee/vskp2,Thomasjkeel/Examining-The-Mid-Latitude-Jet-Stream,onaci/bard,Hariramakrishnan919/Textiledefect,Hariramakrishnan919/Minor-Project,timsta95/mcwindrose,jhaalbu/hendelse,nRiccobo/Leosphere,shanshan825/aiap12-Tong-Shanshan-359I,UNISvalbard/unisacsi,MurzikVasilyevich/info_perun,Mavengence/linkedin-job-scraper-data-analysis,KonstantinosF/Wekeo_Competition,claws-scot/CLAWS,anushavc/rapddetect,RocketPy-Team/RocketPy,OzFlux/PyFluxPro,JohnPapagiannakos/meteoAPI,cgadal-pythonpackages/pydune,os-simopt/wrftamer,jgmsantos/Livro-Python,jhaalbu/frost,alberduris/met-main,cycle13/climate,randulphmorales/romeomemo,SoftwareDevEngResearch/albatross,galendal/FACTS,raj-26singh/Wind-Farm-Simulation,raj-26singh/Wind-Farm-Simulator,nunesotavio/GestaoQualidadeAr_SC,christine-berlin/Capst
one_WindPowerPredictions,goameli/Windenergy_Dashboard,vduseev/number-encoding,SooHooLee/test,brynjarmorka/climvis,JeromeSauer/Capstone_WindPowerPredicting,aeaa1998/lab_10_redes,ECMWFCode4Earth/MaLePoM,autoconsumes/aa,kirubhaharini/URECA,kastnerp/pedestrian-wind-comfort-weibull,nick123pig/wind-rose,Mapacherama/WindAnalyseTool,xiazemin/Wind-Speed-Analysis,kasiagunia/wind-turbine,nusrathfathima/Rain-in-Australia-Prediction,thilowrona/fatbox,vinitshah24/Australia-Rain-Classification,Rajvardhan7/Textile-Detection,Rajvardhan7/WindFarm-Layout-Optimization,julianasierra97/chat_bot,ElieKadoche/configuration_files,GregoireJan/met,halvorot/Floating-wind-turbine-stabilization-RL,jeeve/outilsflask,cgadal-pythonpackages/Wind_data,adamconrad7/Tree-Segmentation,launda/avguide,jhaalbu/av-klima,Wolfrax/tv,vlemeur/vlm-ds-toolbox,mozhemeng/flask-frame,tsunghao-huang/master_thesis,SmartPracticeschool/SBSPS-Challenge-2585-Forecasting-the-power-output-of-wind-farm-based-on-the-weather-conditions.,SmartPracticeschool/SBSPS-Challenge-4546-Predicting-the-energy-output-of-wind-turbine-based-on-weather-condition,SmartPracticeschool/SBSPS-Challenge-3618-Predicting-the-energy-output-of-wind-turbine-based-on-weather-condition,CENER-EPR/OWAbench,miky211296/climvis,clementbrizard/mapping-finland-weather,MeteoR-OI/bd-climato,varungv/django-web-crawler-web-app,SoftwareDefinedBuildings/XBOS,rsoutelino/pyromsgui,launda/learn_flask,ocni-dtu/weather_visualizer,abkfenris/mwac-wind,slj287/tempe_town_lake,weber-s/code_example_ige,oyan99/Wind-Speed-Analysis,cqcn1991/Wind-Speed-Analysis,leportella/oceanpyscripts,rmsare/ec-simple,sandtimething/adhd_diagnosis,nclv/Python-3.5,socib/HFRadarReports,akrherz/pyIEM,RCand/maritima,LionelR/pyair_utils",,https://github.com/python-windrose,,,,,https://avatars.githubusercontent.com/u/28726174?v=4,,,
SHARPy,Simulation of High Aspect Ratio aeroplanes and wind turbines in Python.,ImperialCollegeLondon,https://github.com/ImperialCollegeLondon/sharpy.git,github,"aeroelasticity,simulation,aeronautics,structures,structural-dynamics,wind-turbines",Wind Energy,"2023/10/18, 21:42:02",102,0,22,true,Python,Imperial College London,ImperialCollegeLondon,"Python,Dockerfile,Shell,CMake",https://imperial.ac.uk/aeroelastics/sharpy,"b""# Simulation of High Aspect Ratio aeroplanes in Python [SHARPy]\n\n\n\n[](https://ic-sharpy.readthedocs.io/en/main/?badge=main)\n[](https://codecov.io/gh/ImperialCollegeLondon/sharpy)\n[](https://opensource.org/licenses/BSD-3-Clause)\n[](https://joss.theoj.org/papers/f7ccd562160f1a54f64a81e90f5d9af9)\n[](https://doi.org/10.5281/zenodo.3531965)\n\nSHARPy is a nonlinear aeroelastic analysis package originally developed at the Department of Aeronautics, Imperial\nCollege London. It can be used for the structural, aerodynamic and aeroelastic analysis of flexible wings, aircraft and wind turbines. It is shared here under a BSD 3-Clause permissive license.\n\n\n\n### Contact\n\nFor more information on the [research team](http://www.imperial.ac.uk/aeroelastics/software/) developing SHARPy or to get \nin touch, [visit our homepage](http://www.imperial.ac.uk/aeroelastics).\n\n## Physical Models\n\nSHARPy is a modular aeroelastic solver that currently uses two specific models for the structural and aerodynamic response of the system.\n\nFor the structural model, SHARPy employs a geometrically-exact displacement-based composite beam formulation,\naugmented with Lagrange multipliers for additional kinematic constraints.\nThis model has the advantage of providing the solution directly in the physical problem's degrees of freedom, making the \ncoupling with the aerodynamic solver simple and not requiring any post-processing. 
The 1D beam formulation used limits \nthe analyses that can be done by SHARPy to slender structures, such as high aspect ratio wings.\n\nThe aerodynamic model utilises the Unsteady Vortex Lattice Method (UVLM). The aerodynamic surfaces are modelled as a thin\nvortex ring lattice with the boundary conditions enforced at the collocation points in the middle of the vortex rings.\nThe Kutta condition is also enforced at the trailing edge. The wake can be simulated by either additional vortex rings\nor by infinitely long horseshoe vortices, which are ideally suited for steady simulations only.\n\nThe aerodynamic model has recently been extended by a linear source panel method (SPM) to model nonlifting bodies for example fuselages. The SPM and UVLM can be coupled to model fuselage-wing configuration and a junction handling approach, based on phantom panels and circulation interpolation, has been added.\n\nThe input problems can be structural, aerodynamic or coupled, yielding an aeroelastic system.\n\n## [Capabilities](http://ic-sharpy.readthedocs.io/en/latest/content/capabilities.html)\n\nThe base solver SHARPy is a nonlinear aeroelastic analysis package that can be used on free-flying flexible aircraft,\nwings and wind turbines. 
In addition, it supports linearisation of these nonlinear systems about\narbitrary conditions and includes various tools such as: model reduction or frequency analysis.\n\nIn short, SHARPy offers (amongst others) the following solutions to the user:\n* Static aerodynamic, structural and aeroelastic solutions including fuselage effects\n* Finding trim conditions for aeroelastic configurations\n* Nonlinear, dynamic time domain simulations under a large number of conditions such as:\n + Prescribed trajectories.\n + Free flight.\n + Dynamic follower forces.\n + Control inputs in thrust, control surface deflection...\n + Arbitrary time-domain gusts, including non span-constant ones.\n + Full 3D turbulent fields.\n* Multibody dynamics with hinges, articulations and prescribed nodal motions:\n + Applicable to wind turbines.\n + Hinged aircraft.\n + Catapult assisted takeoffs.\n* Linear analysis:\n + Linearisation around a nonlinear equilibrium.\n + Frequency response analysis.\n + Asymptotic stability analysis.\n* Model order reduction:\n + Krylov-subspace reduction methods.\n + Several balancing reduction methods.\n\n## Documentation\n\nThe documentation for SHARPy can be found [here](http://ic-sharpy.readthedocs.io).\n\n## Installing SHARPy\n\nFor the latest documentation, see the \n[installation docs](https://ic-sharpy.readthedocs.io/en/latest/content/installation.html).\n\nSHARPy can also be obtained from Docker Hub to avoid compilation\nand platform-dependant issues. 
If you are interested, make sure you check \nthe [SHARPy Docker distribution docs](https://ic-sharpy.readthedocs.io/en/latest/content/installation.html#using-sharpy-from-a-docker-container).\n\n## Contributing and Bug reports\n\nIf you think you can add a useful feature to SHARPy, want to write documentation or you encounter a bug, by all means, \ncheck out the [collaboration guide](https://ic-sharpy.readthedocs.io/en/latest/content/contributing.html).\n\n## Citing SHARPy\n\nSHARPy has been published in the Journal of Open Source Software (JOSS) and the relevant paper can be found\n[here](https://joss.theoj.org/papers/10.21105/joss.01885).\n\nIf you are using SHARPy for your work, please remember to cite it using the paper in JOSS as:\n\n`del Carre et al., (2019). SHARPy: A dynamic aeroelastic simulation toolbox for very flexible aircraft and wind\nturbines. Journal of Open Source Software, 4(44), 1885, https://doi.org/10.21105/joss.01885`\n\nThe bibtex entry for this citation is:\n\n```\n@Article{delCarre2019,\ndoi = {10.21105/joss.01885},\nurl = {https://doi.org/10.21105/joss.01885},\nyear = {2019},\nmonth = dec,\npublisher = {The Open Journal},\nvolume = {4},\nnumber = {44},\npages = {1885},\nauthor = {Alfonso del Carre and Arturo Mu{\\~{n}}oz-Sim\\'on and Norberto Goizueta and Rafael Palacios},\ntitle = {{SHARPy}: A dynamic aeroelastic simulation toolbox for very flexible aircraft and wind turbines},\njournal = {Journal of Open Source Software}\n}\n```\n\n\n## Continuous Integration Status\n\nSHARPy uses Continuous Integration to control the integrity of its code. 
The status in the release and develop branches\nis:\n\nMain\n\n\n\nDevelop\n\n""",",https://doi.org/10.5281/zenodo.3531965,https://doi.org/10.21105/joss.01885`\n\nThe,https://doi.org/10.21105/joss.01885","2016/10/07, 10:11:51",2574,BSD-3-Clause,301,4255,"2023/10/18, 21:49:55",20,134,188,32,6,6,0.4,0.7001184132622853,"2023/10/18, 15:52:35",2.2,0,18,false,,false,true,,,https://github.com/ImperialCollegeLondon,,Imperial College London,,,https://avatars.githubusercontent.com/u/1220306?v=4,,,
WindSE,A Python package that uses a FEniCS backend to perform wind farm simulations and optimization.,NREL,https://github.com/NREL/WindSE.git,github,,Wind Energy,"2023/07/11, 19:52:21",43,0,4,true,Python,National Renewable Energy Laboratory,NREL,"Python,Shell",,"b'WindSE: Wind Systems Engineering\n~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n\nSimple Description:\n===================\n\nWindSE is a Python package that uses a FEniCS backend to perform wind farm simulations and optimization. Documentation can be found at: https://windse.readthedocs.io/en/latest/ \n\nQuick Start-Up Guide:\n=====================\n\nIt is easiest to run WindSE within a conda environment. To install conda check this link: `Conda Installation `_. Additionally, WindSE has been tested on MacOS (Catalina 10.15) and Linux (CentOS 7). Windows is not recommended. \n\nQuick Installation Instructions:\n--------------------------------\n\nThe easiest way to install windse is to run::\n\n sh install.sh \n\nThen the environment can be activated using::\n\n conda activate \n\nQuick Demo Instructions:\n------------------------\n\nActivate the conda environment using::\n\n conda activate \n\nThen to run a simple demo, navigate to /demos/documented/Yaml_Examples/ and run::\n\n windse run 0-wind_farm_2D.yaml\n\nThe output of this simulation will be located in the output/2_5D_Wind_Farm/ folder. Use `Paraview `_ to visualize the results in the solutions/ folder. To learn what parameters can be set in the yaml file, head to the `Parameter Documentation `_.\n\n\n\n'",,"2019/04/26, 21:00:20",1642,CUSTOM,1,483,"2023/07/07, 21:49:19",19,79,91,11,109,0,0.0,0.425974025974026,"2023/07/11, 20:07:38",2023.07.01,0,5,false,,false,false,,,https://github.com/NREL,http://www.nrel.gov,"Golden, CO",,,https://avatars.githubusercontent.com/u/1906800?v=4,,,
WEIS,WEIS is a framework that combines multiple tools to enable design optimization of floating offshore wind turbines.,WISDEM,https://github.com/WISDEM/WEIS.git,github,,Wind Energy,"2023/10/19, 15:29:10",41,0,15,true,Fortran,WISDEM,WISDEM,"Fortran,Roff,Python,C,C++,Jupyter Notebook,CMake,TeX,MATLAB,Scheme,F*,Makefile,Batchfile,Shell,Dockerfile,M",,"b'# WEIS\n\n[](https://coveralls.io/github/WISDEM/WEIS?branch=develop)\n[](https://github.com/WISDEM/WEIS/actions)\n[](https://weis.readthedocs.io/en/develop/?badge=develop)\n[](https://zenodo.org/badge/latestdoi/289320573)\n\nWEIS, Wind Energy with Integrated Servo-control, performs multifidelity co-design of wind turbines. WEIS is a framework that combines multiple NREL-developed tools to enable design optimization of floating offshore wind turbines.\n\nAuthor: [NREL WISDEM & OpenFAST & Control Teams](mailto:systems.engineering@nrel.gov)\n\n## Version\n\nThis software is a version 0.0.1.\n\n## Documentation\n\nSee local documentation in the `docs`-directory or access the online version at \n\n## Packages\n\nWEIS integrates in a unique workflow four models:\n* [WISDEM](https://github.com/WISDEM/WISDEM) is a set of models for assessing overall wind plant cost of energy (COE).\n* [OpenFAST](https://github.com/OpenFAST/openfast) is the community model for wind turbine simulation to be developed and used by research laboratories, academia, and industry.\n* [TurbSim](https://www.nrel.gov/docs/fy09osti/46198.pdf) is a stochastic, full-field, turbulent-wind simulator.\n* [ROSCO](https://github.com/NREL/ROSCO) provides an open, modular and fully adaptable baseline wind turbine controller to the scientific community.\n\nIn addition, three external libraries are added:\n* [pCrunch](https://github.com/NREL/pCrunch) is a collection of tools to ease the process of parsing large amounts of OpenFAST output data and conduct loads analysis.\n* [pyOptSparse](https://github.com/mdolab/pyoptsparse) is a framework for formulating and 
efficiently solving nonlinear constrained optimization problems.\n\nSoftware Model Versions:\nSoftware | Version\n--- | ---\nOpenFAST | 3.2.1\nROSCO | 2.6.0\n\nThe core WEIS modules are:\n * _aeroelasticse_ is a wrapper to call [OpenFAST](https://github.com/OpenFAST/openfast)\n * _control_ contains the routines calling the [ROSCO_Toolbox](https://github.com/NREL/ROSCO_toolbox) and the routines supporting distributed aerodynamic control devices, such trailing edge flaps\n * _gluecode_ contains the scripts glueing together all models and libraries\n * _multifidelity_ contains the codes to run multifidelity design optimizations\n * _optimization_drivers_ contains various optimization drivers\n * _schema_ contains the YAML files and corresponding schemas representing the input files to WEIS\n\n## Installation\n\nOn laptop and personal computers, installation with [Anaconda](https://www.anaconda.com) is the recommended approach because of the ability to create self-contained environments suitable for testing and analysis. WEIS requires [Anaconda 64-bit](https://www.anaconda.com/distribution/). WEIS is currently supported on Linux, MAC and Windows Sub-system for Linux (WSL). Installing WEIS on native Windows is not supported.\n\nThe installation instructions below use the environment name, ""weis-env,"" but any name is acceptable. For those working behind company firewalls, you may have to change the conda authentication with `conda config --set ssl_verify no`. Proxy servers can also be set with `conda config --set proxy_servers.http http://id:pw@address:port` and `conda config --set proxy_servers.https https://id:pw@address:port`.\n\n0. On the DOE HPC system eagle, make sure to start from a clean setup and type\n\n module purge\n module load conda \n\n1. 
Set up and activate the Anaconda environment from a prompt (WSL terminal on Windows or Terminal.app on Mac)\n\n conda env create --name weis-env -f https://raw.githubusercontent.com/WISDEM/WEIS/main/environment.yml python=3.9\n conda activate weis-env # (if this does not work, try source activate weis-env)\n sudo apt update # (WSL only, assuming Ubuntu)\n\n2. Use conda to add platform-specific dependencies.\n\n conda config --add channels conda-forge\n conda install -y petsc4py mpi4py # (Mac / Linux only) \n conda install -y compilers # (Mac only) \n sudo apt install gcc g++ gfortran libblas-dev liblapack-dev -y # (WSL only, assuming Ubuntu)\n\n3. Clone the repository and install the software\n\n git clone https://github.com/WISDEM/WEIS.git\n cd WEIS\n git checkout branch_name # (Only if you want to switch branches, say ""develop"")\n python setup.py develop # (The common ""pip install -e ."" will not work here)\n\n4. Instructions specific to the DOE HPC system Eagle. Before executing the setup script, do:\n\n module load comp-intel intel-mpi mkl\n module unload gcc\n python setup.py develop\n\n**NOTE:** To use WEIS again after installation is complete, you will always need to activate the conda environment first with `conda activate weis-env` (or `source activate weis-env`). On Eagle, make sure to reload the necessary modules.\n\n## Developer guide\n\nIf you plan to contribute code to WEIS, please first consult the [developer guide](https://weis.readthedocs.io/en/latest/how_to_contribute_code.html).\n\n## Feedback\n\nFor software issues please use . \n'",",https://zenodo.org/badge/latestdoi/289320573","2020/08/21, 16:54:28",1160,Apache-2.0,2,4365,"2023/10/24, 13:02:35",15,181,226,31,1,3,0.3,0.5947242206235013,"2022/10/05, 18:39:03",v1.1,0,23,false,,false,false,,,https://github.com/WISDEM,https://www.nrel.gov/wind/systems-engineering.html,"NREL National Wind Technology Center, Boulder, CO",,,https://avatars.githubusercontent.com/u/5444272?v=4,,,
pyNuMAD,"An object-oriented, open-source software written in Python which facilitates the creation and analysis of three-dimensional models of wind turbine blades.",sandialabs,https://github.com/sandialabs/pyNuMAD.git,github,,Wind Energy,"2023/10/19, 19:59:53",3,0,3,true,Python,Sandia National Laboratories,sandialabs,Python,https://sandialabs.github.io/pyNuMAD/,"b"" # pyNuMAD\n[pyNuMAD (Python Numerical Manufacturing And Design)](https://github.com/sandialabs/pyNuMAD) is an object-oriented, open-source software program written in Python which simplifies the process of creating a three-dimensional model of a wind turbine blade. The tool organizes all blade information including aerodynamic and material properties as well as material placement into an\nintuitive API for use with other software. The purpose of pyNuMAD is to provide an intermediary between raw blade data in the form of YAML, Excel, and XML files and analytical platforms\n(ANSYS, Cubit, openFAST, etc.).\n\nFor any questions or support [create a new issue](https://github.com/sandialabs/pyNuMAD/issues/new) on GitHub.\n\n## Documentation\nDocumentation for pyNuMAD is accessible at https://sandialabs.github.io/pyNuMAD/.\n\n\n\n## Examples\n\nStep-by-step examples are located in the [examples](https://github.com/sandialabs/pyNuMAD/tree/main/examples) folder. Follow along in the documentation.\n\n## License\n\npyNuMAD is licensed under the BSD 3-clause license. Please see the\n[LICENSE](https://github.com/sandialabs/pyNuMAD/blob/main/LICENSE) included in\nthe source code repository for more details.\n\n## Acknowledgements \n\npyNuMAD is currently being developed with funding from the Department of Energy's\n(DOE) Energy Efficiency and Renewable Energy (EERE) Wind Energy Technology Office (WETO). 
\n""",,"2023/06/29, 17:01:10",118,BSD-3-Clause,39,39,"2023/10/19, 20:00:06",5,34,35,35,6,1,0.1,0.25,"2023/10/19, 20:15:33",1.0.0,0,3,false,,false,false,,,https://github.com/sandialabs,https://software.sandia.gov,United States,,,https://avatars.githubusercontent.com/u/4993680?v=4,,,
HAMS,An open-source computer program for the analysis of wave diffraction and radiation of three-dimensional floating or submerged structures.,YingyiLiu,https://github.com/YingyiLiu/HAMS.git,github,"wave-structure-interaction,boundary-element-method,potential-flow-theory,offshore-wind-platforms,ocean-wave-energy-converters",Wind Energy,"2023/09/08, 04:44:27",75,0,21,true,Roff,,,"Roff,Fortran,Python,Makefile,Batchfile",,"b'\xef\xbb\xbf\n\n# HAMS\n**An open-source computer program for the analysis of wave diffraction and radiation of three-dimensional floating or submerged structures.**\n\n[](http://www.apache.org/licenses/LICENSE-2.0)\n\n ![]()
\n \nHAMS (Hydrodynamic Analysis of Marine Structures) is free, open-source software for analysing wave-structure interactions in the frequency domain. It is based on the boundary integral equation method within the framework of potential flow theory. The code is currently written in FORTRAN 90. It has been developed by the author Yingyi Liu for nearly a decade. \n\nHAMS is released in the hope that it will contribute to eliminating the inequality (for those who cannot afford costly commercial BEM software) in the continuous research developments related to offshore engineering and ocean renewable energies.\n\nHAMS is freely distributed under the Apache License, Version 2.0, http://www.apache.org/licenses/LICENSE-2.0, and may be modified and extended by researchers who intend to enhance its capabilities and port the code to other platforms. \n\nThe success of HAMS should to a large extent be attributed to Prof. Bin Teng (Dalian University of Technology), who taught me the theory of potential flow in marine hydrodynamics and programming skills for the [Boundary Element Method](https://en.wikipedia.org/wiki/Boundary_element_method). The code structure and the coding style of HAMS are two examples of what I have learned and inherited from Prof. Bin Teng.\n\n## Theoretical Basis\n\n ![]()
\n \n### - Please refer to the following papers for the theory:\n\nThe panel-method theory used by HAMS is described in detail in the following papers:\n\n* Yingyi Liu (2021). Introduction of the Open-Source Boundary Element Method Solver HAMS to the Ocean Renewable Energy Community. In: Proc. of the 14th European Wave and Tidal Energy Conference, Plymouth, UK, Sep. 5\xe2\x80\x939, 2021.\n\n* Yingyi Liu (2019). ""HAMS: A Frequency-Domain Preprocessor for Wave-Structure Interactions\xe2\x80\x94Theory, Development, and Application."" _Journal of Marine Science and Engineering_, 7: 81.\n\n* Yingyi Liu, Changhong Hu, Makoto Sueyoshi, Hidetsugu Iwashita, Masashi Kashiwagi (2016). ""Motion response prediction by hybrid panel-stick models for a semi-submersible with bracings."" _Journal of Marine Science and Technology_, 21: 742\xe2\x80\x93757.\n\nThe deep-water Green function uses a Fortran subroutine (https://github.com/Hui-Liang/Green-function-in-deep-water) developed by Dr. Hui Liang. For the detailed theory you may refer to the following three papers:\n\n* Hui Liang, Huiyu Wu, and Francis Noblesse (2018). ""Validation of a global approximation for wave diffraction-radiation in deep water."" _Applied Ocean Research_, 74: 80-86.\n\n* Huiyu Wu, Hui Liang, and Francis Noblesse (2018). ""Wave component in the Green function for diffraction radiation of regular water waves."" _Applied Ocean Research_, 81: 72-75.\n\n* Huiyu Wu, Chenliang Zhang, Yi Zhu, Wei Li, Decheng Wan, Francis Noblesse (2017). ""A global approximation to the Green function for diffraction radiation of water waves."" _European Journal of Mechanics-B/Fluids_, 65: 54-64.\n\nThe finite-depth Green function uses the Fortran subroutine FinGreen3D (https://github.com/YingyiLiu/FinGreen3D) developed by Dr. Yingyi Liu. 
For the detailed theory you may refer to the following two papers:\n\n* Yingyi Liu, Shigeo Yoshida, Changhong Hu, Makoto Sueyoshi, Liang Sun, Junliang Gao, Peiwen Cong, Guanghua He (2018). ""A reliable open-source package for performance evaluation of floating renewable energy systems in coastal and offshore regions."" _Energy Conversion and Management_, 174: 516-536.\n\n* Yingyi Liu, Hidetsugu Iwashita, Changhong Hu (2015). ""A calculation method for finite depth free-surface green function."" _International Journal of Naval Architecture and Ocean Engineering_, 7(2): 375-389.\n\nPlease cite the above papers appropriately in your relevant publications, reports, etc., if the HAMS code or its executable program has contributed to your work.\n\n## Generated numerical results\n\n### - Hydrodynamic coefficients\n\n ![]()
\n\n### - Wave excitation force\n\n ![]()
\n\n### - Motion RAOs\n\n ![]()
\n\n### - Free-surface elevation\n\n ![]()
\n\n## Features\n\n### - Mesh element type\n\n* HAMS can import meshes containing triangular panels, quadrilateral panels, or both.\n ![]()
\n\n### - OpenMP parallel processing\n\n* HAMS can be run in parallel mode using OpenMP techniques on PCs with multiple processors (CPUs).\n ![]()
\n \n### - Computational efficiency\n\n* The following graph shows an example of the DeepCwind semisubmersible using 8 threads for the computation:\n ![]()
\n\n## Useful Links\n\nThe following open-source software can be used to view the HAMS results: \n[1]. [BEMRosetta](https://github.com/izabala123/BEMRosetta). Developed by I\xc3\xb1aki Zabala, Markel Pe\xc3\xb1alba, Yerai Pe\xc3\xb1a-Sanchez.
\n[2]. [BEMIO](https://wec-sim.github.io/bemio/). Developed by National Renewable Energy Laboratory and Sandia National Laboratories.
\n\nYou may need HAMS to do the frequency-domain pre-processing before you use the following programs: \n[1]. [FAST](https://www.nrel.gov/wind/nwtc/fast.html) or [OpenFAST](https://openfast.readthedocs.io/en/master/). Developed by National Renewable Energy Laboratory.
\n[2]. [WEC-Sim](https://wec-sim.github.io/WEC-Sim/). Developed by National Renewable Energy Laboratory and Sandia National Laboratories.
\n\nUsed by other open-source software: \n[1]. [pyHAMS](https://github.com/WISDEM/pyHAMS). Developed by Garrett Barter, National Renewable Energy Laboratory.
\n[2]. [RAFT](https://github.com/WISDEM/RAFT). Developed by Matt Hall, Stein Housner, David Ogden, Garrett Barter, National Renewable Energy Laboratory.
\n\n## License\n\nCode original author: Yingyi Liu (\xe5\x8a\x89\xe7\x9b\x88\xe6\xba\xa2), [Google Scholar](https://scholar.google.co.jp/citations?hl=ja&user=mpR3MvAAAAAJ&view_op=list_works&sortby=pubdate).\n\nHAMS is free software: you can redistribute it and/or modify it under the terms of the Apache License as published by the Free Software Foundation, either version 2 of the License, or (at your option) any later version.\n\nHAMS is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the Apache License for details. You should have received a copy of the Apache License along with HAMS. If not, see http://www.apache.org/licenses/LICENSE-2.0
\n\n\n'",,"2020/10/30, 15:48:46",1090,Apache-2.0,27,205,"2023/09/05, 08:31:05",1,12,31,14,50,0,0.0,0.19387755102040816,,,0,3,false,,false,false,,,,,,,,,,,
brightwind,A Python library that aims to empower wind resource analysts and establish a common industry standard toolset.,brightwind-dev,https://github.com/brightwind-dev/brightwind.git,github,,Wind Energy,"2023/08/11, 17:00:09",38,1,4,true,Python,brightwind,brightwind-dev,Python,,"b""--------------\n```\n __ _ __ __ _ __\n / /_ _____(_)___ / /_ / /__ __(_)___ ___ / /\n / __ \\/ ___/ / __ \\/ __ \\/ __/ | /| / / / __ \\/ __ /\n / /_/ / / / / /_/ / / / / /_ | |/ |/ / / / / / /_/ /\n /_.___/_/ /_/\\__, /_/ /_/\\__/ |__/|__/_/_/ /_/\\__,_/\n /____/\n ```\n **A Python library primarily for wind resource assessments.**\n\n--------------\n\n
\n\nBrightwind is a Python library specifically built for wind analysis. It can load wind speed, wind direction and \nother meteorological time series data. There are various plots you can use to understand this data and to find any \npotential issues. You can apply many common functions to the data, such as shear and long-term adjustments. The \nresulting adjusted data is then output as a frequency distribution tab file which can be used in wind analysis \nsoftware such as WAsP.\n\nThis library can also be used for solar resource analysis.\n\n
\n\n---\n### Installation\n\nYou can use pip from the command line to install the library.\n\n```\nC:\\Users\\Stephen> pip install brightwind\n```\nIt is advisable to use a separate environment to avoid any dependency clashes with other libraries such as Pandas, Numpy \nor Matplotlib you may already have installed.\n\n
\n\nFor those who do not have Python installed and are just getting started, we recommend installing Anaconda. Anaconda is \na Python distribution for scientific computing and so provides everything you need: Python, pip and Jupyter Notebook \nalong with libraries such as Pandas, Numpy and Matplotlib. Datacamp provides a good tutorial for [installing \nAnaconda on Windows](https://www.datacamp.com/tutorial/installing-anaconda-windows) to get started.\n\nOnce Anaconda is installed, you can use the **Anaconda Prompt** to run the above command line `pip install brightwind`. \nOr first use **Anaconda Navigator** to create an environment.\n\n---\n### Documentation\n\nDocumentation on how to get set up and use the library can be found at https://brightwind-dev.github.io/brightwind-docs/\n\n
\n\nExample usage of the brightwind library is shown below using Jupyter Notebook. Jupyter Notebook is a powerful way to \nimmediately see the results of code you have written.\n
\n\n\n\n\n\n
\n\n\n\n\n
\n\n##### Features\nThe library provides wind analysts with easy to use tools for working with\nmeteorological data. It supports loading of meteorological data, averaging,\nfiltering, plotting, correlations, shear analysis, long term adjustments, etc.\nThe library can then export a resulting long term adjusted tab file to be used in\nother wind analysis software.\n\n
\n\n##### Benefits\nThe key benefit of an open-source library is that it provides complete transparency\nand traceability. Anyone in the industry can review any part of the code and suggest changes,\nthus creating a standardised, validated toolkit for the industry.\n\nBy default, during an assessment every manipulation or adjustment made to the wind data is\ncontained in a single file. This can easily be reviewed and checked by internal reviewers or,\nas the underlying code is open-sourced, there is no reason why this file cannot be sent to\n3rd parties for review, thus increasing the effectiveness of a bank's due diligence.\n\n
\n\n##### License\nThe library is licensed under the MIT license.\n\n
\n\n---\n### Test datasets\nA test dataset is included in this repository and is used to demonstrate and test functions in the code. \nOther files and datasets are also included to complement this demo dataset. These are outlined below:\n\n
\n\n| Dataset | Source | Notes |\n|:--------------------- |:-------------|:-----|\n| demo_data.csv | BrightWind | A modified 2 year met mast dataset in csv and Campbell Scientific format. |\n| MERRA-2_XX_2000-01-01_2017-06-30.csv | NASA [GES DISC](https://disc.gsfc.nasa.gov/) | 4 x MERRA-2 18-yr datasets to complement the demo data for long term analyses. |\n| demo_cleaning_file.csv | BrightWind | A file containing information on what periods to clean out from the demo data. |\n| windographer_flagging_log.txt | BrightWind | The same cleaning info as found in 'demo_cleaning_file.csv' formatted as a Windographer flagging file. |\n| demo_data_iea43_wra_data_model.json | BrightWind | A JSON file formatted according to the IEA Wind Task 43 [WRA Data Model](https://github.com/IEA-Task-43/digital_wra_data_standard) standard which describes the mast configuration for the demo data. |\n\n
\n\n---\n### Contributing\nIf you wish to be involved or find out more please contact stephen@brightwindanalysis.com.\n\nMore information can be found in the [contributing.md](https://github.com/brightwind-dev/brightwind/blob/master/contributing.md) section of the website.\n\n
\n""",,"2018/12/11, 15:49:26",1779,MIT,240,1547,"2023/10/10, 08:14:29",70,179,349,61,15,1,0.2,0.6412273800157356,"2023/06/01, 14:29:54",v2.1.0,0,10,false,,false,true,narest-qa/repo50,,https://github.com/brightwind-dev,,,,,https://avatars.githubusercontent.com/u/45794645?v=4,,,
NRWAL,A library of offshore wind cost equations.,NREL,https://github.com/NREL/NRWAL.git,github,,Wind Energy,"2023/09/13, 18:08:28",15,3,5,true,Python,National Renewable Energy Laboratory,NREL,"Python,Shell",https://nrel.github.io/NRWAL/,"b""*****************\nWelcome to NRWAL!\n*****************\n\n.. image:: https://github.com/NREL/NRWAL/workflows/Documentation/badge.svg\n :target: https://nrel.github.io/NRWAL/\n\n.. image:: https://github.com/NREL/NRWAL/workflows/Pytests/badge.svg\n :target: https://github.com/NREL/NRWAL/actions?query=workflow%3A%22Pytests%22\n\n.. image:: https://github.com/NREL/NRWAL/workflows/Lint%20Code%20Base/badge.svg\n :target: https://github.com/NREL/NRWAL/actions?query=workflow%3A%22Lint+Code+Base%22\n\n.. image:: https://img.shields.io/pypi/pyversions/NREL-NRWAL.svg\n :target: https://pypi.org/project/NREL-NRWAL/\n\n.. image:: https://badge.fury.io/py/NREL-NRWAL.svg\n :target: https://badge.fury.io/py/NREL-NRWAL\n\n.. image:: https://anaconda.org/nrel/nrel-NRWAL/badges/version.svg\n :target: https://anaconda.org/nrel/nrel-NRWAL\n\n.. image:: https://anaconda.org/nrel/nrel-NRWAL/badges/license.svg\n :target: https://anaconda.org/nrel/nrel-NRWAL\n\n.. image:: https://codecov.io/gh/nrel/NRWAL/branch/main/graph/badge.svg?token=NB29X039VU\n :target: https://codecov.io/gh/nrel/NRWAL\n\n.. image:: https://zenodo.org/badge/319377095.svg\n :target: https://zenodo.org/badge/latestdoi/319377095\n\n.. image:: https://mybinder.org/badge_logo.svg\n :target: https://mybinder.org/v2/gh/NREL/NRWAL/HEAD\n\n\n.. inclusion-intro\n\nThe National Renewable Energy Laboratory Wind Analysis Library (NRWAL):\n\n#. A library of offshore wind cost equations (plus new energy technologies like marine hydro!)\n#. Easy equation manipulation without editing source code\n#. Full continental-scale integration with the NREL Renewable Energy Potential Model (reV)\n#. Ready-to-use configs for basic users\n#. Dynamic python tools for intuitive equation handling\n#. 
One seriously badass sea unicorn\n\nTo get started with NRWAL, check out the `NRWAL Config documentation `_ or the `NRWAL example notebook `_. You can also launch the notebook in an interactive Jupyter shell right in your browser without any downloads or software using `binder `_. \n\nReady to build a model with NRWAL but don't want to contribute to the library? No problem! Check out the example getting started project `here `_.\n\nHere is the important stuff:\n\n - `The NRWAL Equation Library `_.\n - `Default NRWAL Configs `_.\n\nInstalling NRWAL\n================\n\nOption 1: Install from PIP or Conda (recommended for analysts):\n---------------------------------------------------------------\n\n1. Create a new environment:\n ``conda create --name nrwal``\n\n2. Activate the environment:\n ``conda activate nrwal``\n\n3. Install NRWAL:\n 1) ``pip install NREL-NRWAL`` or\n 2) ``conda install nrel-nrwal --channel=nrel``\n\nOption 2: Clone repo (recommended for developers)\n-------------------------------------------------\n\n1. from home dir, ``git clone https://github.com/NREL/NRWAL.git``\n 1) enter github username\n 2) enter github password\n\n2. Create ``NRWAL`` environment and install package\n 1) Create a conda env: ``conda create -n nrwal``\n 2) Run the command: ``conda activate nrwal``\n 3) cd into the repo cloned in 1.\n 4) prior to running ``pip`` below, make sure the branch is correct (install\n from master!)\n 5) Install ``NRWAL`` and its dependencies by running:\n ``pip install .`` (or ``pip install -e .`` if running a dev branch\n or working on the source code)\n\nNRWAL Variables for Offshore Wind (OSW)\n=======================================\n\n.. 
list-table:: NRWAL Inputs\n :widths: auto\n :header-rows: 1\n\n * - Variable Name\n - Long Name\n - Source\n - Units\n * - `aeff`\n - Array Efficiency\n - `array_efficiency` input layer, computed from ORBIT\n - `%`\n * - `capex_multi`\n - CAPEX Multiplier\n - Supplied by user\n - unit-less\n * - `depth`\n - Water depth (positive values)\n - `bathymetry` input layer\n - m\n * - `dist_a_to_s`\n - Distance from assembly area to site\n - Computed from `assembly_area` input layer\n - km\n * - `dist_op_to_s`\n - Distance from operating port to site\n - `ports_operations` input layer\n - km\n * - `dist_p_to_a`\n - Distance from port (construction no-limit) to assembly area\n - `assembly_area` input layer\n - km\n * - `dist_p_to_s`\n - Distance from construction port to site\n - `ports_construction` input layer\n - km\n * - `dist_p_to_s_nolimit`\n - Distance from no-limit construction port to site\n - `ports_construction_nolimit` input layer\n - km\n * - `dist_s_to_l`\n - Distance from site to nearest land\n - `dist_to_coast` input layer\n - km\n * - `fixed_downtime`\n - Average weather downtime for fixed structure turbines\n - `weather_downtime_fixed_bottom` input layer\n - fraction\n * - `floating_downtime`\n - Average weather downtime for floating structure turbines\n - `weather_downtime_floating` input layer\n - fraction\n * - `gcf`\n - Gross capacity factor\n - Computed by reV / SAM with losses == 0\n - unit-less\n * - `hs_average`\n - Significant wave height to determine weather downtime\n - `weather_downtime_mean_wave_height_buoy` input layer\n - m\n * - `num_turbines`\n - Number of turbines in array\n - Supplied by user\n - unit-less\n * - `transmission_multi`\n - Transmission cost multiplier\n - Supplied by user\n - unit-less\n * - `turbine_capacity`\n - Capacity of each turbine in the array\n - Supplied by user\n - MW\n\nRecommended Citation\n====================\n\nIf using the NRWAL software (replace with current version and DOI):\n\n - Grant Buster, Jake Nunemaker, 
and Michael Rossol. The National Renewable Energy Laboratory Wind Analysis Library (NRWAL). https://github.com/NREL/NRWAL (version v0.0.2), 2021. https://doi.org/10.5281/zenodo.4705961.\n\nIf using the Offshore Wind (OSW) cost equations:\n\n - Beiter, Philipp, Walter Musial, Aaron Smith, Levi Kilcher, Rick Damiani, Michael Maness, Senu Sirnivas, Tyler Stehly, Vahan Gevorgian, Meghan Mooney, and George Scott. \xe2\x80\x9cA Spatial-Economic Cost-Reduction Pathway Analysis for U.S. Offshore Wind Energy Development from 2015\xe2\x80\x932030.\xe2\x80\x9d National Renewable Energy Lab. (NREL), Golden, CO (United States), September 1, 2016. https://doi.org/10.2172/1324526. https://www.nrel.gov/docs/fy16osti/66579.pdf.\n\nIf using the marine energy reference model (RM) cost models:\n\n - https://energy.sandia.gov/programs/renewable-energy/water-power/projects/reference-model-project-rmp/\n - Jenne, D. S., Y. H. Yu, and V. Neary. \xe2\x80\x9cLevelized Cost of Energy Analysis of Marine and Hydrokinetic Reference Models: Preprint.\xe2\x80\x9d National Renewable Energy Lab. (NREL), Golden, CO (United States), April 24, 2015. https://www.osti.gov/biblio/1215196-levelized-cost-energy-analysis-marine-hydrokinetic-reference-models-preprint.\n""",",https://zenodo.org/badge/latestdoi/319377095\n\n,https://doi.org/10.5281/zenodo.4705961.\n\nIf,https://doi.org/10.2172/1324526","2020/12/07, 16:23:18",1052,BSD-3-Clause,39,303,"2023/09/13, 18:08:29",0,44,47,11,42,0,0.6,0.4380952380952381,"2022/11/17, 22:51:23",v0.0.11,0,7,false,,false,false,"pswild/king_pine,Eric-Musa/EnergyCapability,NREL/reV",,https://github.com/NREL,http://www.nrel.gov,"Golden, CO",,,https://avatars.githubusercontent.com/u/1906800?v=4,,,
welib,"Wind energy library: Python and MATLAB tools for wind turbine analyses.",ebranlard,https://github.com/ebranlard/welib.git,github,,Wind Energy,"2023/06/16, 18:17:09",48,0,20,true,Python,,,"Python,MATLAB,Fortran,Jupyter Notebook,F*,Mathematica,Makefile,M",,"b'\xef\xbb\xbf[](https://github.com/ebranlard/welib/actions?query=workflow%3A%22Tests%22)\r\n
\r\n\r\n# Wind Energy Library - welib\r\n\r\nWind energy library: a suite of Python and MATLAB tools for aero-servo-hydro-elasticity (aerodynamics, controls, hydrodynamics, structure/elasticity) and wind energy.\r\n\r\n# Installation and testing\r\nInstalling the latest release:\r\n```bash\r\npip install --upgrade welib\r\n```\r\nInstalling the latest dev version and running the unittests:\r\n```bash\r\ngit clone http://github.com/ebranlard/welib -b dev\r\ncd welib\r\npython -m pip install -r requirements.txt\r\npython -m pip install -e .\r\npytest\r\n```\r\n\r\n# Gallery of example scripts\r\n\r\nA sample of the figures generated by the examples in this repository is given below.\r\nAdditional examples can be found in the `examples` and `tests` folders of the different subpackages.\r\n\r\nClick on the links to access the corresponding scripts. \r\nClick on the figures to enlarge them.\r\n\r\n| | | | | |\r\n| :-------------------------: | :-------------------------: | :-------------------------: | :-------------------------: | :-------------------------: |\r\n| [Airfoils - 3D correction](/welib/airfoils/examples/correction3D.py) | [Airfoils - MHH dynamic stall model](/welib/airfoils/examples/dynamic_stall_mhh.py) | [Airfoils - Oye dynamic stall model](/welib/airfoils/examples/dynamic_stall_oye.py) | [Airfoils - Wagner function](/welib/airfoils/examples/wagner.py) | [Beam - Analytical mode shapes of a beam](/welib/beams/examples/Ex1_BeamModes.py) |\r\n|  |  |  |  |  |\r\n| [Beam - Analytical mode shapes different BC](/welib/beams/examples/Ex2_BeamModesAllBC.py) | [BEM Steady - High thrust correction](/welib/BEM/examples/Example_AxialInduction.py) | [BEM Steady - Performance curve](/welib/BEM/examples/Example_BEM_2.py) | [BEM Steady - CP-lambda-pitch ](/welib/BEM/examples/Example_BEM_CPLambdaPitch.py) | [BEM Theory - Ideal rotor planform](/welib/BEM/examples/Example_IdealRotor.py) |\r\n|  |  |  |  |  |\r\n| [BEM Unsteady - Prescribed surge 
motion](/welib/BEM/examples/Example_UnsteadyBEM_2_PrescribedMotion.py) | [Dynamic Inflow (Oye) - induction step](/welib/dyninflow/examples/Ex1_StepUp.py) | [FAST - interpolate radial time series](/welib/fast/examples/Example_RadialInterp.py) | [FAST - Average radial outputs](/welib/fast/examples/Example_RadialPostPro.py) | [FEM - mode shapes of tower](/welib/FEM/examples/Beam_ModeShapes_Tower.py) |\r\n|  | -InductionStep.png) |  |  |  |\r\n| [FEM - mode shapes of a beam](/welib/FEM/examples/Beam_ModeShapes_UniformBeamFrame3d.py) | [Hydro - Wave kinematics](/welib/hydro/examples/Ex1_WaveKinematics.py) | [Hydro - Jonswap spectrum](/welib/hydro/examples/Ex2_Jonswap_spectrum.py) | [Hydro - wave generation](/welib/hydro/examples/Ex3_WaveTimeSeries.py) | [Hydro - Morison loads on monopile](/welib/hydro/examples/Ex4_WaveLoads.py) |\r\n|  |  |  |  |  |\r\n| [Plot - 3D blades](/welib/plot/examples/Plot_3D_blades.py) | [IEC Standards - Turbulence classes](/welib/standards/examples/Ex1_TurbulenceClasses.py) | [IEC Standards - Extreme operating gusts](/welib/standards/examples/Ex2_EOG.py) | [System - Lorenz attractor](/welib/system/examples/Lorenz.py) | [System - 2nd order - forced vibrations](/welib/system/examples/MassSpringDamper_ForcedVibrations.py) |\r\n|  |  |  |  |  |\r\n| [System - LTI Bode plot - 2nd order mass spring damper](/welib/system/examples/MassSpringDamper_StateSpace_FreqDomain.py) | [System - 3D pendulum - motion](/welib/system/examples/pendulum_3d.py) | [System - 2nd order - Responses](/welib/system/examples/StepResponse.py) | [Signal - Correlation coefficient](/welib/tools/examples/ExampleCorrelation.py) | [Signal - FFT](/welib/tools/examples/Example_FFT.py) |\r\n|  |  |  |  |  |\r\n| [Vortilib - Elliptical Coordinates](/welib/vortilib/elements/examples/EllipticalCoordinates.py) | [Vortilib - Inviscid Vorticity Patch](/welib/vortilib/elements/examples/InviscidVortexPatch.py) | [Vortilib - Flow about an 
Ellipsoid](/welib/vortilib/elements/examples/SourceEllipsoid_Plots.py) | [Vortilib - Vortex helix lifting line velocity](/welib/vortilib/elements/examples/VortexHelix.py) | [Vortilib - Vortex particle regularization](/welib/vortilib/elements/examples/VortexParticle_Regularization.py) |\r\n|  |  |  |  |  |\r\n| [Vortilib - 2D vorticity patch discretized with vortex points](/welib/vortilib/elements/examples/VortexPoint2DDistribution.py) | [Vortilib - Vortex segment regularization](/welib/vortilib/elements/examples/VortexSegment_Regularization.py) | [Vortilib - Flow about an axisymmetric vorticity surface](/welib/vortilib/elements/examples/VortexSurfaceFlowField.py) | [Wind - wind generation at point](/welib/wind/examples/WindGenerationAtPoint.py) | [WT Theory - Wake Expansion Models](/welib/wt_theory/examples/WakeExpansion.py) |\r\n|  |  |  |  |  |\r\n| [WT Theory - Induced velocity vs Wake length](/welib/wt_theory/examples/WakeLengthInducedVelocity.py) | [PartDyn - Gravitational and spring interactions](/welib/yams/partdyn/examples/ThreePart_Gravitation.py) | [PartDyn - Gravitational interaction - Moon Orbit](/welib/yams/partdyn/examples/TwoPart_Orbits.py) | | |\r\n|  |  |  | | |\r\n\r\n# Examples of application\r\n\r\n\r\nYou can have a look at the example gallery below for direct links to examples and associated plots.\r\n\r\n- Aerodynamic applications (package `airfoils`, `BEM`):\r\n - Manipulation of airfoil curves, find slopes, interpolate (see [airfoils](welib/airfoils/examples/))\r\n - Run different dynamic stall models (e.g Oye or MHH/HGM model) (see [airfoils/DS](welib/airfoils/examples/))\r\n\r\n- Hydrodynamics applications (package `hydro`):\r\n - Wave kinematics for linear waves (see [hydro/Ex1](welib/hydro/examples/Ex1_WaveKinematics.py))\r\n - Generation of wave time series from a given spectrum (see [hydro/Ex3](welib/hydro/examples/Ex3_WaveTimeSeries.py))\r\n - Computation of wave loads on a monopile (see 
[hydro/Ex4](welib/hydro/examples/Ex4_WaveLoads.py))\r\n\r\n- Structural dynamics and system dynamics applications (packages `FEM`, `system`, `yams`):\r\n - Setup the equation of motions for a multibody system with flexible members analytically or numerically (see [yams](welib/yams/tests))\r\n - Linearize a non-linear system defined by a state and output equation (implicit or explicit) (see [system](welib/system/tests))\r\n - Perform 2d/3d FEM analyses using beam/frame elements (see [FEM](welib/FEM/examples))\r\n - Craig-Bampton / Guyan reduction of a structure (see [FEM](welib/FEM/examples))\r\n - Perform time integration of mechanical systems (see [system](welib/system/examples))\r\n\r\n- Controls applications (packages `ctrl`, `kalman`):\r\n - Run a kalman filter to estimate states of a system (see [kalman](welib/kalman/))\r\n\r\n- Wind energy applications:\r\n - Run steady state BEM simulations (see [BEM/steady 1-2](welib/BEM/examples)\r\n - Run unsteady BEM simulations (see [BEM/unsteady 1-2](welib/BEM/examples/)\r\n - Read and write common wind energy file formats (see [weio](welib/weio), a clone of [weio](http://github.com/ebranlard/weio/))\r\n - Generate stochastic wind and [waves](welib/hydro/examples/Ex3_WaveTimeSeries.py) times series\r\n - Estimate wind speed (see \'welib\\ws\\_estimator`))\r\n - Theory of optimal circulation\r\n - Standards\r\n\r\n- Other (packages `tools`, `ode`):\r\n - Spectral analyses, signal processing, time integration, vector analyses\r\n\r\nSee also:\r\n\r\n- [pyDatView](http://github.com/ebranlard/pyDatView/): GUI to visualize files (supported by weio) and perform analyses on the data\r\n\r\n\r\n\r\n\r\n# Libraries\r\n\r\nThe repository contains a set of small packages, for aerodynamics, structure, control and more:\r\n\r\n- airfoils: polar manipulations, dynamic stall models\r\n- beams: analytical results for beams\r\n- BEM: steady and unsteady bem code\r\n- ctrl: control tools\r\n- dyninflow: dynamic inflow models\r\n- 
fastlib: tools to handle OpenFAST models (run simulations, postprocess, linear model)\r\n- FEM: Finite Element Method tools (beams)\r\n- hydro: hydrodynamic tools\r\n- kalman: kalman filter\r\n- mesh: meshing tools\r\n- ode: tools for time integration of ODE\r\n- standards: some formulae and scripts useful for the IEC standards\r\n- system: tools for dynamic systems (e.g. LTI, state space) and mechanical systems (M,C,K matrices), eigenvalue analysis, time integration\r\n- tools: mathematical tools, signal processing\r\n- weio: library to read and write files used in wind energy, clone of [weio](http://github.com/ebranlard/weio/) \r\n- wt\\_theory: scripts implementing some wind turbine aerodynamic theory \r\n- ws\\_estimator: wind speed estimator for wind energy based on tabulated Cp Ct\r\n- yams: multibody analyses\r\n\r\n\r\n# References and how to cite\r\nIf you find some of this repository useful and use it in your research, thank you for using the following citations.\r\n\r\n - General wind turbine scripts and aerodynamics:\r\n```bibtex\r\n@book{Branlard:book,\r\n author = {E. Branlard},\r\n title = {Wind Turbine Aerodynamics and Vorticity-Based Methods: Fundamentals and Recent Applications},\r\n year = {2017},\r\n publisher= {Springer International Publishing},\r\n doi={10.1007/978-3-319-55164-7},\r\n isbn={ 978-3-319-55163-0}\r\n}\r\n```\r\n - Structural dynamics:\r\n```bibtex\r\n@article{Branlard:2019,\r\n title = {{Flexible multibody dynamics using joint coordinates and the Rayleigh-Ritz approximation: The general framework behind and beyond Flex}},\r\n author = {E. Branlard},\r\n journal = {Wind Energy},\r\n volume = {22},\r\n number = {7},\r\n pages = {877-893},\r\n year = {2019},\r\n doi = {10.1002/we.2327}\r\n}\r\n```\r\n\r\n\r\n\r\n\r\n\r\n# Contributing\r\nAny contributions to this project are welcome! If you find this project useful, you can also buy me a coffee (donate a small amount) with the link below:\r\n\r\n\r\n
\r\n'",,"2018/10/17, 22:47:46",1833,MIT,119,497,"2023/10/06, 17:42:40",0,5,6,5,19,0,0.0,0.0,,,0,1,false,,false,false,,,,,,,,,,,
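The hydro examples in the welib row above center on linear (Airy) wave kinematics. As an illustration of the underlying theory only (not welib's actual API), the linear dispersion relation omega^2 = g*k*tanh(k*h) can be solved for the wave number with a short Newton iteration:

```python
import math

def wave_number(omega, depth, g=9.81, tol=1e-12):
    """Solve the linear dispersion relation omega^2 = g*k*tanh(k*h) for k."""
    k = omega**2 / g  # deep-water initial guess
    for _ in range(100):  # Newton iteration
        f = g * k * math.tanh(k * depth) - omega**2
        df = g * math.tanh(k * depth) + g * k * depth / math.cosh(k * depth) ** 2
        k_next = k - f / df
        if abs(k_next - k) < tol:
            return k_next
        k = k_next
    return k

def surface_elevation(amplitude, k, omega, x, t):
    """Linear (Airy) free-surface elevation eta(x, t)."""
    return amplitude * math.cos(k * x - omega * t)

# Example: an 8 s period wave in 30 m water depth
omega = 2.0 * math.pi / 8.0
k = wave_number(omega, depth=30.0)
```

In deep water the solution collapses to k = omega^2/g, which is why that value makes a good initial guess.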
digital_wra_data_standard,This standard data model and associated tools are intended as universal building blocks for Wind Energy Resource Assessment applications.,IEA-Task-43,https://github.com/IEA-Task-43/digital_wra_data_standard.git,github,,Wind Energy,"2023/10/20, 15:04:58",48,0,14,true,Jupyter Notebook,IEA Wind Digitalization (Task 43),IEA-Task-43,"Jupyter Notebook,TypeScript,JavaScript,HTML,Python,CSS",,"b'\n
\n\n[](https://github.com/IEA-Task-43/awesome-wind)\n\n# Digital WRA Data Standards\n\nThis repository is for the work, carried out by IEA Wind Task 43, on standardizing digital tools in wind resource \nassessment (WRA). It currently consists of data models for wind resource measurements (_WRA Data Model_) and \ndigital calibration certificates. Along with associated tools, these are intended as universal building blocks for \nwind energy yield assessment applications.\n\n## Mission\n_**""Our mission is to make the energy yield assessment process more efficient, transparent and reproducible \nthrough digitizing and automation.""**_\n\n## What is a ""Data Model""?\nA data model is an abstract model of real-world entities that organizes elements of data and standardizes how they \nrelate to one another. In this instance, the _WRA Data Model_ standardizes how properties of a wind resource measurement station (e.g. \nlatitude, longitude, anemometer serial number, installation height, logger slope, logger offset, etc.) are recorded and \nhow all the properties relate to each other. An implementation of the data model therefore describes how a specific met mast was \ninstalled, how the sensors were mounted on that met mast, how these sensors were programmed into the logger and how \nall these properties may have changed over time.\n\nFor more information on the definition of a Data Model and other terminology such as Schema, please see the \n[Task 43 Glossary](https://iea-task-43.gitbook.io/iea-task-43-glossary/terms/data-model).\n\n
\n\n# The WRA Data Model\n\nThe _WRA Data Model_ provides the instructions for how to digitally represent the configuration of an installed met mast, lidar, sodar, \nfloating lidar or solar measurement station. An implementation of the data model can therefore contain the latitude and \nlongitude of where the met mast was installed, at what height a wind speed measurement is made and by what \nsensor, how this sensor is mounted onto the mast, how the sensor is connected to the logger and how this logger channel\nis programmed (i.e. what slope and offset values are programmed into the logger), and how all these properties \nmay have changed over time.\n\nTo learn more about the _WRA Data Model_, please read Amit Bohara\'s \n[introduction](https://github.com/IEA-Task-43/digital_wra_data_standard/wiki/Task-43-WRA-Data-Model---An-introduction)\nin the Wiki section of this GitHub repository.\n\nJSON ([JavaScript Object Notation](https://www.json.org/)) is used to implement the _WRA Data Model_ and \n[JSON Schema](https://json-schema.org/) is used to express it. The JSON Schema file, located at \n[./schema/iea43_wra_data_model.schema.json](./schema/iea43_wra_data_model.schema.json), is the \n_WRA Data Model_. It can be thought of as a blueprint for how an implementation of the data model can be described with \nthe JSON data-interchange format. See Figure 1 below for a snippet of an example implementation.\n\n\n
\n_Figure 1: Example implementation of the WRA Data Model._\n\nThe following tools are part of this undertaking:\n\n- [WRA Data Model](./schema/iea43_wra_data_model.schema.json): This JSON Schema file is the _WRA Data Model_. It describes \n how a JSON file that describes wind resource measurement data should be composed.\n\n- [Documentation](./docs/README.md): Markdown documentation for the _WRA Data Model_ created directly from the JSON Schema.\n\n- [Form App](https://iea-task-43.github.io/digital_wra_data_standard/): This app shows a form that is modeled after the \n JSON Schema and can create JSON data out of your inputs that is in accordance with the _WRA Data Model_.\n\n- [Python Data Model Loading Example](./tools/load_demo_schema.ipynb): This notebook shows how to read an example file that \n uses the _WRA Data Model_ with Python. [](https://mybinder.org/v2/gh/IEA-Task-43/digital_wra_data_standard/master?filepath=.%2Ftools%2Fload_demo_schema.ipynb)\n\n## Dissemination\nThe Task 43 team is actively working to disseminate information and improve user experience. \nThe [Wiki](https://github.com/IEA-Task-43/digital_wra_data_standard/wiki) tab in this GitHub repository is the starting \npoint to learn more via tutorials, recorded presentations, videos and more.\n\n- [Wiki](https://github.com/IEA-Task-43/digital_wra_data_standard/wiki)\n\n
\n\n# The Digital Calibration Certificate Data Model\n\nAs a requirement resulting from the development of the _WRA Data Model_, a \n[digital calibration certificate](./digital_calibration_certificate) is introduced as a complementary standard. This \nstandard currently supports anemometer calibration certificates according to IEC 61400-12-1:2017. \n\n
\n\n# Contributing to the Standard\nWe welcome all contributions including issue reporting, new features and bug fixes. For full details see the contributing \nguidelines and other resources below:\n\n- [Contribution Guidelines](./contributing.md)\n- [Issue Log](https://github.com/IEA-Task-43/digital_wra_data_standard/issues) where you can ask for new features or inform us of any bugs.\n- [Kanban board](https://github.com/IEA-Task-43/digital_wra_data_standard/projects/1) showing which issues are being worked on, completed or yet to do.\n- [Meeting Notes](https://github.com/IEA-Task-43/digital_wra_data_standard/discussions/129#discussion-3748501) of our regular meetings.\n\n# Getting Help\n\nPlease file a new issue in this repository with questions or concerns you might have. If you would like to chat directly with the contributors, please join our Slack channel at [ieawinddigitalization.slack.com](https://ieawinddigitalization.slack.com/).\n\nYou can find out more about the IEA\'s Wind Task 43 working group at [ieawindtask43.org](https://www.ieawindtask43.org/work-package-4-digital-wra).\n\n\n## Pipeline Status\n| Pipeline | Status | Result |\n|:---------|:-------|:-------|\n| Documentation |  | [Documentation](./docs/README.md) |\n| Form App |  | [Form App](https://iea-task-43.github.io/digital_wra_data_standard/) |\n'",,"2019/12/03, 21:26:02",1421,BSD-3-Clause,80,581,"2023/10/20, 15:12:11",37,96,194,37,5,0,0.6,0.4004576659038902,"2023/01/27, 14:41:09",v1.2.0-2023.01,0,13,false,,false,true,,,https://github.com/IEA-Task-43,www.ieawindtask43.org,,,,https://avatars.githubusercontent.com/u/57503085?v=4,,,
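To make the idea of checking a document against the WRA Data Model concrete, here is a minimal, library-free sketch. The field names below are hypothetical stand-ins; the authoritative property names are defined by the JSON Schema file in the repository above, and a real workflow would use a proper JSON Schema validator:

```python
import json

# Hypothetical station snippet loosely following the data model's concepts
# (latitude/longitude, measurement points, sensors); NOT the real schema.
station_json = """
{
  "name": "Demo Met Mast",
  "latitude_ddeg": 55.5,
  "longitude_ddeg": -3.2,
  "measurement_point": [
    {"name": "Spd_80m", "height_m": 80.0, "sensor_model": "AcmeClassA"}
  ]
}
"""

REQUIRED = {"name", "latitude_ddeg", "longitude_ddeg", "measurement_point"}

def missing_fields(doc: dict) -> set:
    """Return the required top-level keys absent from the document."""
    return REQUIRED - doc.keys()

station = json.loads(station_json)
```

The same round-trip (parse, then test required properties) is what a schema validator automates, with types, enumerations, and nesting rules on top.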
awebox,Modelling and optimal control of single- and multiple-kite systems for airborne wind energy.,awebox,https://github.com/awebox/awebox.git,github,,Wind Energy,"2023/08/15, 10:50:59",19,0,2,true,Python,,awebox,"Python,Shell",,"b'# awebox\n\n`awebox` is a Python toolbox for modelling and optimal control of multiple-kite systems for Airborne Wind Energy (AWE). It provides interfaces that aim to relieve the user of the burden of:\n\n* generating optimization-friendly system dynamics for different combinations of modeling options.\n* formulating optimal control problems for common multi-kite trajectory types.\n* solving the trajectory optimization problem reliably.\n* postprocessing and visualizing the solution and performing quality checks.\n* tracking MPC design and handling for offline closed-loop simulations.\n\nThe main focus of the toolbox is _rigid-wing_, _lift_- and _drag_-mode multiple-kite systems.\n\n## Installation\n\n`awebox` runs on Python 3. It depends heavily on the modeling language CasADi, which is a symbolic framework for algorithmic differentiation. CasADi also provides the interface to the NLP solver IPOPT. \nIt is optional but highly recommended to use HSL linear solvers as a plugin with IPOPT.\n\n1. Get a local copy of the latest `awebox` release:\n\n ```\n git clone https://github.com/awebox/awebox.git\n ```\n\n2. Install using pip:\n\n ```\n pip3 install awebox/\n ```\n\n3. In order to get the HSL solvers and render them visible to CasADi, follow these [instructions](https://github.com/casadi/casadi/wiki/Obtaining-HSL). Additional installation instructions can be found [here](https://github.com/awebox/awebox/blob/develop/INSTALLATION.md).\n\n\n## Getting started\n\nTo run one of the examples from the `awebox` root folder:\n\n```\npython3 examples/single_kite_lift_mode_simple.py\n```\n\n## Acknowledgments\n\nThis software has been developed in collaboration with the company [Kiteswarms Ltd](http://www.kiteswarms.com). 
The company has also supported the project through research funding.\n\nThis project has received funding from the European Union\xe2\x80\x99s Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant agreement No 642682 (_AWESCO_)\n\n## Citing `awebox`\nPlease use the following citation: \n\n""_awebox: Modelling and optimal control of single- and multiple-kite systems for airborne wind energy. https://github.com/awebox/awebox_""\n\n## Literature\n\n### `awebox`-based research\n\n[Optimal Control of Stacked Multi-Kite Systems for Utility-Scale Airborne Wind Energy](https://cdn.syscop.de/publications/DeSchutter2019.pdf) \\\nDe Schutter et al. / IEEE Conference on Decision and Control (CDC) 2019\n\n[Wake Characteristics of Pumping Mode Airborne Wind Energy Systems](https://cdn.syscop.de/publications/Haas2019.pdf) \\\nHaas et al. / Journal of Physics: Conference Series 2019\n\n[Operational Regions of a Multi-Kite AWE System](https://cdn.syscop.de/publications/Leuthold2018.pdf) \\\nLeuthold et al. / European Control Conference (ECC) 2018\n\n[Optimal Control for Multi-Kite Emergency Trajectories](https://cdn.syscop.de/publications/Bronnenmeyer2018.pdf) \\\nBronnenmeyer (Masters thesis) / University of Stuttgart 2018\n\n### Models\n\n**Induction models**\\\n[Engineering Wake Induction Model For Axisymmetric Multi-Kite Systems](https://www.researchgate.net/publication/334616920_Engineering_Wake_Induction_Model_For_Axisymmetric_Multi-Kite_Systems) \\\nLeuthold et al. / Wake Conference 2019\n\n**Point-mass model**\\\n[Airborne Wind Energy Based on Dual Airfoils](https://cdn.syscop.de/publications/Zanon2013a.pdf) \\\nZanon et al. / IEEE Transactions on Control Systems Technology 2013\n\n### Methods\n\n**Homotopy strategy** \\\n[A Relaxation Strategy for the Optimization of Airborne Wind Energy Systems](https://cdn.syscop.de/publications/Gros2013a.pdf) \\\nGros et al. 
/ European Control Conference (ECC) 2013\n\n**Trajectory optimization** \\\n[Numerical Trajectory Optimization for Airborne Wind Energy Systems Described by High Fidelity Aircraft Models](https://cdn.syscop.de/publications/Horn2013.pdf) \\\nHorn et al. / Airborne Wind Energy 2013\n\n### Software\n\n**IPOPT**\\\n[On the Implementation of a Primal-Dual Interior Point Filter Line Search Algorithm for Large-Scale Nonlinear Programming](http://cepac.cheme.cmu.edu/pasilectures/biegler/ipopt.pdf) \\\nW\xc3\xa4chter et al. / Mathematical Programming 106 (2006) 25-57\n\n**CasADi**\\\n[CasADi - A software framework for nonlinear optimization and optimal control](https://cdn.syscop.de/publications/Andersson2018.pdf) \\\nAndersson et al. / Mathematical Programming Computation 2018\n'",,"2019/04/26, 09:12:42",1643,LGPL-3.0,11,1105,"2023/08/15, 10:50:59",17,96,114,6,71,9,0.0,0.37368421052631584,,,0,5,false,,false,true,,,https://github.com/awebox,,,,,https://avatars.githubusercontent.com/u/49553965?v=4,,,
CCBlade.jl,A blade element momentum method for propellers and turbines.,byuflowlab,https://github.com/byuflowlab/CCBlade.jl.git,github,"rotor,propeller,wind-turbine,bem,bemt,aerodynamics,aircraft,rotorcraft",Wind Energy,"2023/08/07, 23:03:20",44,0,12,true,Python,BYU FLOW Lab,byuflowlab,"Python,Julia",,"b'# CCBlade.jl\n\n[](https://flow.byu.edu/CCBlade.jl/stable)\n\n\n\n**Summary**: A blade element momentum method for propellers and turbines. \n\n**Author**: Andrew Ning\n\n**Features**:\n\n- Methodology is provably convergent (see although multiple improvements have been made since then)\n- Prandtl hub/tip losses (or user-defined losses)\n- Glauert/Buhl empirical region for high thrust turbines\n- Convenience functions for inflow with shear, precone, yaw, tilt, and azimuth\n- Can do airfoil corrections beforehand or on the fly (Mach, Reynolds, rotation)\n- Allows for flow reversals (negative inflow/rotation velocities)\n- Allows for a hover condition (only rotation, no inflow) and rotor locked (no rotation, only inflow)\n- Compatible with AD tools like ForwardDiff\n\n**Installation**:\n\n```julia\n] add CCBlade\n```\n\n**Documentation**:\n\nThe [documentation](https://flow.byu.edu/CCBlade.jl/stable/) contains\n- A quick start tutorial to learn basic usage,\n- Guided examples to address specific or more advanced tasks,\n- A reference describing the API,\n- Theory in full detail.\n\n**Run Unit Tests**:\n\n```julia\npkg> activate .\npkg> test\n```\n\n**Citing**:\n\nNing, A., \xe2\x80\x9cUsing Blade Element Momentum Methods with Gradient-Based Design Optimization,\xe2\x80\x9d Structural and Multidisciplinary Optimization, Vol. 64, No. 2, pp. 994\xe2\x80\x931014, May 2021. doi:10.1007/s00158-021-02883-6\n\n**Python / OpenMDAO users**\n\nIn the `openmdao` folder there is a Python wrapper to this package to enable usage from [OpenMDAO](https://openmdao.org). 
This wrapper was developed/maintained by Daniel Ingraham and Justin Gray at NASA Glenn.\n'",,"2016/05/18, 19:33:38",2716,CUSTOM,16,254,"2023/08/07, 23:26:29",4,7,23,8,78,2,0.14285714285714285,0.3883928571428571,"2023/08/08, 20:08:40",v0.2.5,0,7,false,,false,false,,,https://github.com/byuflowlab,http://flow.byu.edu,"Provo, UT",,,https://avatars.githubusercontent.com/u/10734941?v=4,,,
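CCBlade.jl implements blade element momentum (BEM) theory. As background only (not CCBlade's API), the one-dimensional momentum theory that anchors every BEM code gives CP = 4a(1 - a)^2 for an ideal actuator disk, with the Betz maximum of 16/27 at axial induction a = 1/3:

```python
# One-dimensional momentum theory for an ideal actuator disk: power
# coefficient as a function of the axial induction factor a.
def power_coefficient(a: float) -> float:
    return 4.0 * a * (1.0 - a) ** 2

# Scanning a on a fine grid recovers the Betz limit CP_max = 16/27 at a = 1/3.
best_a = max((i / 10_000 for i in range(5_001)), key=power_coefficient)
cp_max = power_coefficient(best_a)
```

A BEM solver like CCBlade refines this picture with per-annulus induction factors, airfoil polars, and hub/tip loss corrections.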
lidarwind,Retrieve wind speed and direction profiles from Doppler lidar observations from the WindCube-200s.,jdiasn,https://github.com/jdiasn/lidarwind.git,github,,Wind Energy,"2023/06/10, 14:07:57",9,0,4,true,Python,,,"Python,TeX,Makefile",,"b""======================\nlidarwind introduction\n======================\n\n.. image:: https://joss.theoj.org/papers/28430a0c6a79e6d1ff33579ff13458f7/status.svg\n :target: https://doi.org/10.21105/joss.04852\n\n.. image:: https://zenodo.org/badge/DOI/10.5281/zenodo.7026548.svg\n :target: https://doi.org/10.5281/zenodo.7026548\n\n.. image:: https://readthedocs.org/projects/lidarwind/badge/?version=latest\n :target: https://lidarwind.readthedocs.io/en/latest/?badge=latest\n :alt: Documentation Status\n\n.. image:: https://mybinder.org/badge_logo.svg\n :target: https://mybinder.org/v2/gh/jdiasn/lidarwind/main?labpath=docs%2Fexamples\n\n.. image:: https://img.shields.io/pypi/v/lidarwind.svg\n :target: https://pypi.python.org/pypi/lidarwind/\n\n.. image:: https://codecov.io/gh/jdiasn/lidarwind/branch/main/graph/badge.svg?token=CEZM17YY3I\n :target: https://codecov.io/gh/jdiasn/lidarwind\n\nlidarwind is an open-source Python project to retrieve wind speed and direction profiles from Doppler lidar observations from the WindCube-200s, and it was developed to be easy to use. It can retrieve wind profiles from the 6-beam and DBS scanning strategies and allow users to set the signal-to-noise ratio threshold to reduce the noise. It also calculates the Reynolds stress tensor matrix elements from the 6-beam observations.\n\nlidarwind is a result of an effort to create an environment where it would be flexible and easy to process the observations from the WindCube Doppler lidar. Its development started in 2021 when I had to retrieve wind profiles from the 6-beam observations.\n\nThis current version focuses on the WindCube's observations, and the wind retrievals are dedicated to the 6-beam and DBS observations. 
However, it can be expanded to other Doppler lidar observations and scanning strategies.\n\n\n-------------\nDocumentation\n-------------\n\nlidarwind's documentation is available at https://lidarwind.readthedocs.io. There you can find the steps needed for installing the package. You can also find a short description of how lidarwind derives the wind speed and direction from WindCube's observations.\n\n\nNotebooks\n=========\n\nAn introductory set of rendered notebooks is available at https://nbviewer.org/github/jdiasn/lidarwind/tree/main/docs/examples/ or at https://github.com/jdiasn/lidarwind/tree/main/docs/examples. If you want to try the package without installing it locally, click on the binder badge above. You will be redirected to a virtual environment where you can also access the same notebooks and test the package.\n\n.. warning::\n\n Beware that between versions 0.1.6 and 0.2.0, the package underwent significant refactoring. Now class names\n follow Pascal case, while module names, functions and attributes follow snake case. Code developed using the previous\n version will need revision.\n \n--------\nCitation\n--------\n\nIf you use lidarwind, or replicate part of it, in your work/package, please consider including the reference:\n\nNeto, J. D. and Castel\xc3\xa3o, G. P., (2023). lidarwind: A Python package for retrieving wind profiles from Doppler lidar observations. Journal of Open Source Software, 8(82), 4852, https://doi.org/10.21105/joss.04852\n\n::\n\n @article{Neto2023,\n doi = {10.21105/joss.04852},\n url = {https://doi.org/10.21105/joss.04852},\n year = {2023}, publisher = {The Open Journal},\n volume = {8}, number = {82}, pages = {4852},\n author = {Jos\xc3\xa9 Dias Neto and Guilherme P. 
Castelao},\n title = {lidarwind: A Python package for retrieving wind profiles from Doppler lidar observations},\n journal = {Journal of Open Source Software}\n }""",",https://doi.org/10.21105/joss.04852\n\n,https://doi.org/10.5281/zenodo.7026548\n\n,https://doi.org/10.21105/joss.04852\n\n::\n\n,https://doi.org/10.21105/joss.04852","2021/04/08, 13:20:14",930,CUSTOM,55,163,"2023/06/16, 13:36:12",2,98,106,62,131,0,0.1,0.10457516339869277,"2023/01/31, 21:42:27",v0.2.2,0,2,false,,false,true,,,,,,,,,,,
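The DBS retrieval that lidarwind performs rests on a simple geometric idea: each slanted beam measures a projection of the wind vector, and beams at four azimuths suffice to invert for (u, v, w). A textbook sketch of that inversion (illustrative only; lidarwind's implementation additionally handles noise filtering and the 6-beam strategy) is:

```python
import math

def radial_velocity(u, v, w, azimuth_deg, elevation_deg):
    """Project wind (u east, v north, w up) onto a beam (azimuth from north)."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return (u * math.sin(az) + v * math.cos(az)) * math.cos(el) + w * math.sin(el)

def dbs_retrieve(vr_n, vr_e, vr_s, vr_w, elevation_deg):
    """Invert four DBS radial velocities (N, E, S, W beams) for (u, v, w)."""
    el = math.radians(elevation_deg)
    u = (vr_e - vr_w) / (2.0 * math.cos(el))
    v = (vr_n - vr_s) / (2.0 * math.cos(el))
    w = (vr_n + vr_e + vr_s + vr_w) / (4.0 * math.sin(el))
    return u, v, w

# Synthetic check: project a known wind onto the four beams, then invert.
beams = [radial_velocity(8.0, 3.0, 0.2, az, 75.0) for az in (0, 90, 180, 270)]
u, v, w = dbs_retrieve(*beams, 75.0)
```

Opposite beam pairs cancel the vertical component, which is why the horizontal wind falls out of simple differences.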
FLOWUnsteady,An interactional aerodynamics and acoustics solver for multirotor aircraft and wind energy.,byuflowlab,https://github.com/byuflowlab/FLOWUnsteady.git,github,"aerodynamics,aircraft,cfd,vpm,vortex-methods,acoustics,aeroacoustics,computational-fluid-dynamics,rotorcraft,vtol,wind-turbine",Wind Energy,"2023/04/20, 22:07:38",192,0,91,true,Julia,BYU FLOW Lab,byuflowlab,Julia,,"b'
\n\n\n \n Interactional aerodynamics solver for multirotor aircraft and wind energy\n \n
\n\n\n \n
\n \n \n
\n \n
\n\n---\n\nFLOWUnsteady is an open-source variable-fidelity framework for unsteady\naerodynamics and aeroacoustics based on the\n[reformulated vortex particle method](https://scholarsarchive.byu.edu/etd/9589/)\n(rVPM).\nThis suite brings together various tools developed by the\n[FLOW Lab](http://flow.byu.edu/) at Brigham Young University: Vortex lattice\nmethod, strip theory, blade elements, 3D panel method, and rVPM.\nThe suite also integrates an FW-H solver and a BPM code for tonal\nand broadband prediction of aeroacoustic noise.\nIn the low end of fidelity, simulations are similar to a free-wake method,\nwhile in the high end simulations become meshless large eddy simulations.\n\n\n* *Documentation:* [flow.byu.edu/FLOWUnsteady](https://flow.byu.edu/FLOWUnsteady)\n* *Code:* [github.com/byuflowlab/FLOWUnsteady](https://github.com/byuflowlab/FLOWUnsteady)\n\n### What is the Reformulated VPM?\n\nThe [reformulated VPM](https://scholarsarchive.byu.edu/etd/9589/) is a meshless\nCFD method solving the LES-filtered incompressible Navier-Stokes equations in\ntheir vorticity form,\n\n
\n
\nIt uses a Lagrangian (meshless) scheme, which not only\navoids the hurdles of mesh generation but also conserves vortical structures\nover long distances with minimal numerical dissipation.\n\nThe rVPM uses particles to discretize the Navier-Stokes equations, with the\nparticles representing radial basis functions that construct a continuous\nvorticity/velocity field. The basis functions become the LES filter, providing a\nvariable filter width and spatial adaptation as the particles are convected and\nstretched by the velocity field. The local evolution of the filter width\nprovides an extra degree of freedom to reinforce conservation laws, which makes\nthe reformulated VPM numerically stable (overcoming the numerical issues that\nplague the classic VPM).\n\nThis meshless LES has several advantages over conventional mesh-based CFD.\nIn the absence of a mesh,\n1. the rVPM does not suffer from the numerical dissipation introduced by a mesh,\n2. it integrates over coarser discretizations without losing physical accuracy, and\n3. 
derivatives are calculated analytically rather than approximated through a stencil.\n\nFurthermore, rVPM is highly efficient since it uses computational elements only\nwhere there is vorticity (rather than meshing the entire space), making it 100x\nfaster than conventional mesh-based LES with comparable accuracy.\n\n\nWhile rVPM is well suited for resolving unbounded flows (wakes), complications\narise when attempting to impose boundary conditions (solid boundaries) on the flow.\nThis is because (1) the method is meshless, and (2) boundary conditions must\nbe imposed on the Navier-Stokes equations in the form of vorticity.\nFLOWUnsteady is a framework designed to introduce solid boundaries into the rVPM\nusing actuator models.\nWings and rotors are introduced in the computational domain through actuator\nline and surface models that use low-fidelity aerodynamic methods\n(*e.g.*, VLM, lifting line,\npanels, etc) to compute forces and embed the associated\nvorticity back into the LES domain.\n\n\n
\n\n\n
\n\n\n### Variable Fidelity for Preliminary-to-Detailed Design\n\nrVPM considerably reduces engineering time by avoiding the hurdles of mesh\ngeneration. Furthermore, since it is not limited by the time-step and stability\nconstraints of conventional mesh-based CFD, rVPM can be used across all levels\nof fidelity, all in the same framework by simply coarsening or refining the\nsimulation.\nIn the low end of fidelity, simulations are similar to a free-wake method,\nwhile in the high end simulations become meshless large eddy simulations.\nThus, FLOWUnsteady can be used as a high-fidelity tool that is orders of\nmagnitude faster than mesh-based CFD, or as a variable-fidelity tool for\nthe different stages of design.\n\n\n
\n
\n\n### Capabilities\n\n > **Simulation:**\n > *Tilting wings and rotors*\n > *\xe2\x80\xa2 Rotors with variable RPM and variable pitch*\n > *\xe2\x80\xa2 Asymmetric and stacked rotors*\n > *\xe2\x80\xa2 Maneuvering vehicle with prescribed kinematics*\n >\n > **rVPM Solver:**\n > *Fast-multipole acceleration through [ExaFMM](https://joss.theoj.org/papers/10.21105/joss.03145)*\n > *\xe2\x80\xa2 CPU parallelization through OpenMP*\n > *\xe2\x80\xa2 Second-order spatial accuracy and third-order RK time integration*\n > *\xe2\x80\xa2 Numerically stable by reshaping particles subject to vortex stretching*\n > *\xe2\x80\xa2 Subfilter-scale (SFS) model of turbulence associated to vortex stretching*\n > *\xe2\x80\xa2 SFS model coefficient computed dynamically or prescribed*\n > *\xe2\x80\xa2 Viscous diffusion through core spreading*\n >\n > **Wing Models:**\n > *Actuator line model through lifting line + VLM*\n > *\xe2\x80\xa2 Actuator surface model through vortex sheet + VLM*\n > *\xe2\x80\xa2 Parasitic drag through airfoil lookup tables*\n >\n > **Rotor Model:**\n > *Actuator line model through blade elements*\n > *\xe2\x80\xa2 Airfoil lookup tables automatically generated through XFOIL*\n > *\xe2\x80\xa2 Aeroacoustic noise through FW-H (PSU-WOPWOP) and BPM*\n >\n > **Under development *(*\xf0\x9f\xa4\x9e*coming soon)*:**\n > *Advanced actuator surface models through 3D panel method (for ducts, wings, and fuselage)*\n > *\xe2\x80\xa2 Bluff bodies through vortex sheet method*\n >\n > **Limitations:**\n > *Viscous drag and separation is only captured through airfoil lookup tables, without attempting to shed separation wakes*\n > *\xe2\x80\xa2 Incompressible flow only (though wave drag can be captured through airfoil lookup tables)*\n > *\xe2\x80\xa2 CPU parallelization through OpenMP without support for distributed memory (no MPI, i.e., only single-node* runs)\n >\n > *Coded in [the Julia 
language](https://www.infoworld.com/article/3284380/what-is-julia-a-fresh-approach-to-numerical-computing.html) for Linux, MacOS, and Windows WSL.*\n\n\n\n\n\nMore about the models inside FLOWUnsteady:\n\n \n
\n \n
\n\n
\n\n### Selected Publications\nSee the following publications for an in-depth dive into the theory and validation:\n\n* E. J. Alvarez, J. Mehr, & A. Ning (2022), ""FLOWUnsteady: An Interactional Aerodynamics Solver for Multirotor Aircraft and Wind Energy,"" *AIAA AVIATION Forum*. [**[VIDEO]**](https://youtu.be/SFW2X8Lbsdw) [**[PDF]**](https://scholarsarchive.byu.edu/facpub/5830/)\n* E. J. Alvarez & A. Ning (2022), ""Reviving the Vortex Particle Method: A Stable Formulation for Meshless Large Eddy Simulation,"" *(in review)*. [**[PDF]**](https://arxiv.org/pdf/2206.03658.pdf)\n* E. J. Alvarez (2022), ""Reformulated Vortex Particle Method and Meshless Large Eddy Simulation of Multirotor Aircraft,"" *Doctoral Dissertation, Brigham Young University*. [**[VIDEO]**](https://www.nas.nasa.gov/pubs/ams/2022/08-09-22.html) [**[PDF]**](https://scholarsarchive.byu.edu/etd/9589/)\n
\n\n### Examples\n\n**Propeller:** [[Tutorial](https://flow.byu.edu/FLOWUnsteady/examples/propeller-J040)] [[Validation](https://flow.byu.edu/FLOWUnsteady/theory/validation/#Propeller)]\n\n
\n\n\n**Rotor in Hover:** [[Tutorial](https://flow.byu.edu/FLOWUnsteady/examples/rotorhover-aero)] [[Validation](https://flow.byu.edu/FLOWUnsteady/theory/validation/#Rotor)]\n\n
\n\n\n**Blown Wing:** [[Tutorial](https://flow.byu.edu/FLOWUnsteady/examples/blownwing-aero)] [[Validation](https://flow.byu.edu/FLOWUnsteady/theory/validation/#Rotor-Wing-Interactions)]\n\n\n
\n
\n\n
\n\n**Airborne-Wind-Energy Aircraft:** [[Video](https://www.youtube.com/watch?v=iFM3B4_N2Ls)]\n\n\n
\n
\n\n\n**eVTOL Transition:** [[Tutorial](https://flow.byu.edu/FLOWUnsteady/examples/vahana-vehicle/)]\n\nMid-fidelity\n
\n\nHigh-fidelity\n
\n\n\n**Aeroacoustic Noise:** [[Tutorial](https://flow.byu.edu/FLOWUnsteady/examples/rotorhover-acoustics)] [[Validation](https://flow.byu.edu/FLOWUnsteady/theory/validation/#Rotor)]\n\n\n
\n
\n\n
\n\n\n\n\n### Sponsors\n\n\n
\n
\n
\n\n\n\n### About\n\nFLOWUnsteady is an open-source project jointly led by the\n[FLOW Lab](http://flow.byu.edu/) at Brigham Young University and\n[Whisper Aero](http://whisper.aero/).\nAll contributions are welcome.\n\nIf you find FLOWUnsteady useful in your work, we kindly request that you cite the following paper [[URL]](https://arc.aiaa.org/doi/10.2514/6.2022-3218) [[PDF]](https://scholarsarchive.byu.edu/cgi/viewcontent.cgi?article=6735&context=facpub):\n\n>Alvarez, E. J., Mehr, J., and Ning, A., \xe2\x80\x9cFLOWUnsteady: An Interactional Aerodynamics Solver for Multirotor Aircraft and Wind Energy,\xe2\x80\x9d AIAA AVIATION 2022 Forum, Chicago, IL, 2022. DOI:[10.2514/6.2022-3218](https://doi.org/10.2514/6.2022-3218).\n\nIf you were to encounter any issues, please first read through\n[the documentation](https://flow.byu.edu/FLOWUnsteady/) and [open/closed\nissues](https://github.com/byuflowlab/FLOWUnsteady/issues?q=is%3Aissue+is%3Aclosed).\nIf the issue still persists, please\n[open a new issue](https://github.com/byuflowlab/FLOWUnsteady/issues).\n\n * Main developer : Eduardo J. Alvarez ([edoalvarez.com](https://www.edoalvarez.com/))\n * Created : Sep 2017\n * License : MIT License\n'",",https://arxiv.org/pdf/2206.03658.pdf,https://doi.org/10.2514/6.2022-3218","2019/09/30, 19:55:38",1486,MIT,102,767,"2023/10/20, 17:21:04",17,13,60,31,5,6,0.0,0.3287292817679558,"2023/04/20, 22:12:56",v3.2.1,0,8,false,,false,false,,,https://github.com/byuflowlab,http://flow.byu.edu,"Provo, UT",,,https://avatars.githubusercontent.com/u/10734941?v=4,,,
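The Biot-Savart kernel with a regularized (finite) core is the basic building block of any vortex particle method, including the rVPM described above. A minimal 2D sketch of such a kernel (illustrative only, unrelated to FLOWUnsteady's Julia internals):

```python
import math

def induced_velocity(gamma, xv, yv, x, y, eps=1e-3):
    """Velocity at (x, y) induced by a regularized 2D vortex of circulation
    gamma located at (xv, yv); eps is the finite core (smoothing) radius."""
    dx, dy = x - xv, y - yv
    r2 = dx * dx + dy * dy
    if r2 == 0.0:
        return 0.0, 0.0  # a particle induces no velocity on itself
    # The (1 - exp(...)) factor removes the 1/r singularity inside the core
    factor = gamma / (2.0 * math.pi * r2) * (1.0 - math.exp(-r2 / eps**2))
    return -factor * dy, factor * dx

# Far from the core the regularized kernel recovers the ideal point vortex,
# whose swirl speed at radius r is gamma / (2*pi*r).
u, v = induced_velocity(1.0, 0.0, 0.0, 1.0, 0.0)
```

Summing this kernel over all particles (accelerated by a fast multipole method in practice) yields the velocity field that convects and stretches the particles.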
stochLAB,A tool to run collision risk models for seabirds on offshore wind farms.,HiDef-Aerial-Surveying,https://github.com/HiDef-Aerial-Surveying/stochLAB.git,github,"collision-risk,offshore-wind,seabirds,migratoryspecies",Wind Energy,"2023/03/10, 09:46:11",6,0,5,true,R,HiDef Aerial Surveying,HiDef-Aerial-Surveying,R,https://hidef-aerial-surveying.github.io/stochLAB/,"b'\n\n\n# stochLAB
\n\n\n\n[](https://github.com/HiDef-Aerial-Surveying/stochLAB/actions/workflows/test-coverage.yaml)\n[](https://github.com/HiDef-Aerial-Surveying/stochLAB/actions/workflows/pkgdown.yaml)\n[](https://github.com/HiDef-Aerial-Surveying/stochLAB/actions/workflows/R-CMD-check.yaml)\n[](https://github.com/HiDef-Aerial-Surveying/stochLAB/actions/workflows/pkgcheck.yaml)\n[](https://github.com/ropensci/software-review/issues/551)\n\n\n\n`{stochLAB}` is a tool to run Collision Risk Models (CRMs) for seabirds\non offshore wind farms.\n\n## Overview\n\nThe `{stochLAB}` package is an adaptation of the [R\ncode](https://data.marine.gov.scot/dataset/developing-avian-collision-risk-model-incorporate-variability-and-uncertainty-r-code-0)\ndeveloped by [Masden\n(2015)](https://data.marine.gov.scot/dataset/developing-avian-collision-risk-model-incorporate-variability-and-uncertainty)\nto incorporate variability and uncertainty in the avian collision risk\nmodel originally developed by [Band\n(2012)](https://www.bto.org/sites/default/files/u28/downloads/Projects/Final_Report_SOSS02_Band1ModelGuidance.pdf).\nThe package is for use by individuals modelling collision risk of\nseabirds at offshore wind farms. 
The primary functions take input\ninformation on the morphology, behaviour and densities of seabirds as\nwell as data pertaining to the proposed wind farm (i.e., turbine\ndimensions, speed and number).\n\nThese collision risk models are useful for marine ornithologists who are\nworking in the offshore wind industry, particularly in UK waters.\nHowever, the package itself relies on generic biological and windfarm\ndata and can be applied anywhere (i.e., in any marine environment) as\nlong as the parameters are appropriate for the species and windfarms of\ninterest.\n\nCode developed under `{stochLAB}` substantially re-factored and\nre-structured Masden\xe2\x80\x99s (heavily script-based) implementation into a\nuser-friendly, streamlined, well-documented and easily distributed tool.\nFurthermore, the package lays down the code infrastructure for easier\nincorporation of new functionality, e.g.\xc2\xa0extra parameter sampling\nfeatures, model expansions, etc.\n\nIn addition, previous code underpinning core calculations for the\nextended model has been replaced by an alternative approach, resulting\nin significant gains in computational speed over Masden\xe2\x80\x99s code. 
This optimization is particularly beneficial in a stochastic context, where the core calculations are called repeatedly during simulations.

For a more detailed overview, type `?stochLAB` once the package is installed!

## Installation

You can install the released version of stochLAB from [CRAN](https://CRAN.R-project.org) with:

``` r
install.packages("stochLAB")
```

You can install the development version with:

``` r
# install.packages("devtools")
devtools::install_github("HiDef-Aerial-Surveying/stochLAB")
```

This package depends on the following packages, which should be installed automatically:

`cli dplyr glue logr magrittr msm pracma purrr rlang stats tibble tidyr`

## Bug reports

To report any bugs, please log an [issue](https://github.com/HiDef-Aerial-Surveying/stochLAB/issues).

## Input parameters

Many of the input parameters for the `stoch_crm()` function need to be obtained from developers (e.g., blade pitch, rotor radius, wind speed, etc.). However, many parameters describing the morphology and biology of UK seabirds are built into the `sCRM` package, which can be found [here](https://github.com/dmpstats/sCRM). `sCRM` is an R Shiny application that wraps the `stoch_crm()` and `band_crm()` functions. 
These biological parameters can be accessed by installing the `sCRM` package and running `sCRM::spp_dflts`, which returns a tibble with all the relevant information.

If performing a collision risk assessment in UK waters, default biological data for the following parameters can be obtained from the `sCRM` package:

`flt_speed_pars, body_lt_pars, wing_span_pars, avoid_bsc_pars, avoid_ext_pars, noct_act_pars, flight_type, gen_fhd_boots`

Other parameters concerning the species of interest need to be derived from site-based surveys:

`prop_crh_pars, bird_dens_dt,` and `site_fhd_boots`

All wind farm parameters need to be obtained from the wind farm developers:

`n_blades, air_gap_pars, rtr_radius_pars, bld_width_pars, bld_pitch_pars, rtn_speed_pars, windspd_pars, trb_wind_avbl, trb_downtime_pars, wf_n_trbs, wf_width, wf_latitude, tidal_offset, season_specs, bld_chord_prf, lrg_arr_corr`

The following parameters control the outputs:

`out_format, out_sampled_pars, out_period, verbose, log_file`

More information on input parameter specifics can be found in the vignettes for `stoch_crm` and `band_crm`.

## Outputs

Once the collision risk model is run, the key outputs are presented as tables containing the mean, standard deviation and median number of collisions, summarised by month, season or year. Quantiles of the bootstrapped collisions are also presented in the tables. These tables are accessed from the returned model object. 
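As a hypothetical sketch (not part of the package documentation), the seasonal summary tables can be aggregated further once extracted; here `stochOUT` is assumed to be an object returned by `stoch_crm()` with `out_format = "summaries"` and `out_period = "seasons"`, carrying the `mean` column shown in the example output further down:

``` r
# Hypothetical sketch -- assumes dplyr is installed and `stochOUT` was
# returned by stoch_crm(out_format = "summaries", out_period = "seasons", ...)
library(dplyr)

# Seasonal collision summaries for option 1 of the model
opt1 <- stochOUT$collisions$opt1

# Annual expected collisions: the sum of the seasonal means
annual <- opt1 %>%
  summarise(annual_mean = sum(mean))
```

Note that standard deviations and quantiles cannot simply be summed across seasons; those should be re-derived from the underlying bootstrap replicates rather than aggregated from the summary table.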
Run the [Examples](#examples) to view exemplar outputs.

``` r
stochOUT <- stochLAB::stoch_crm(...)

stochOUT$collisions$opt1 # outputs from option 1 of the stochastic collision risk model
stochOUT$collisions$opt2 # outputs from option 2 of the stochastic collision risk model
stochOUT$collisions$opt3 # outputs from option 3 of the stochastic collision risk model
```

## Examples

### Simple example

This is a basic example of running the stochastic collision model for one seabird species and one turbine/wind-farm scenario, with fictional input parameter data.

``` r
library(stochLAB)

# ------------------------------------------------------
# Setting some of the required inputs upfront

# monthly bird densities
b_dens <- data.frame(
  month = month.abb,
  mean = runif(12, 0.8, 1.5),
  sd = runif(12, 0.2, 0.3))

# Generic FHD bootstraps for one species, from Johnston et al (2014)
fhd_boots <- generic_fhd_bootstraps[[1]]

# wind speed vs rotation speed vs pitch
wind_rtn_ptch <- data.frame(
  wind_speed = seq_len(30),
  rtn_speed = 10 / (30:1),
  bld_pitch = c(rep(90, 4), rep(0, 8), 5:22))

# wind availability
windavb <- data.frame(
  month = month.abb,
  pctg = runif(12, 85, 98))

# maintenance downtime
dwntm <- data.frame(
  month = month.abb,
  mean = runif(12, 6, 10),
  sd = rep(2, 12))

# seasons specification
seas_dt <- data.frame(
  season_id = c("a", "b", "c"),
  start_month = c("Jan", "May", "Oct"),
  end_month = c("Apr", "Sep", "Dec"))

# ----------------------------------------------------------
# Run stochastic CRM, treating rotor radius, air gap and
# blade width as fixed parameters (i.e. not stochastic)

stoch_crm(
  model_options = c(1, 2, 3),
  n_iter = 1000,
  flt_speed_pars = data.frame(mean = 7.26, sd = 1.5),
  body_lt_pars = data.frame(mean = 0.39, sd = 0.005),
  wing_span_pars = data.frame(mean = 1.08, sd = 0.04),
  avoid_bsc_pars = data.frame(mean = 0.99, sd = 0.001),
  avoid_ext_pars = data.frame(mean = 0.96, sd = 0.002),
  noct_act_pars = data.frame(mean = 0.033, sd = 0.005),
  prop_crh_pars = data.frame(mean = 0.06, sd = 0.009),
  bird_dens_opt = "tnorm",
  bird_dens_dt = b_dens,
  flight_type = "flapping",
  prop_upwind = 0.5,
  gen_fhd_boots = fhd_boots,
  n_blades = 3,
  rtr_radius_pars = data.frame(mean = 80, sd = 0), # sd = 0: rotor radius is fixed
  air_gap_pars = data.frame(mean = 36, sd = 0),    # sd = 0: air gap is fixed
  bld_width_pars = data.frame(mean = 8, sd = 0),   # sd = 0: blade width is fixed
  rtn_pitch_opt = "windSpeedReltn",
  windspd_pars = data.frame(mean = 7.74, sd = 3),
  rtn_pitch_windspd_dt = wind_rtn_ptch,
  trb_wind_avbl = windavb,
  trb_downtime_pars = dwntm,
  wf_n_trbs = 200,
  wf_width = 15,
  wf_latitude = 56.9,
  tidal_offset = 2.5,
  lrg_arr_corr = TRUE,
  verbose = TRUE,
  seed = 1234,
  out_format = "summaries",
  out_sampled_pars = TRUE,
  out_period = "seasons",
  season_specs = seas_dt,
  log_file = file.path(getwd(), "scrm_example.log")
)
#> 
#> ── Stochastic CRM ──
#> 
#> ✔ Checking inputs [87ms]
#> ✔ Preparing data [152ms]
#> ✔ Sampling parameters [423ms]
#> ✔ Calculating collisions | 1000/1000 iterations [1.6s]
#> ✔ Sorting outputs [761ms]
#> ✔ Job done!
#> $collisions
#> $collisions$opt1
#> # A tibble: 3 × 10
#>   season_id period   mean    sd median pctl_2.5 pctl_25 pctl_75 pctl_9…¹ pctl_99
#>   <chr>     <chr>   <dbl> <dbl>  <dbl>    <dbl>   <dbl>   <dbl>    <dbl>   <dbl>
#> 1 a         Jan_Apr  28.0 12.7    28.8     7.15   18.2     36.4    52.0     67.3
#> 2 b         May_Sep  56.4 25.3    58.1    15.4    37.2     73.5   105.     142. 
#> 3 c         Oct_Dec  19.3  8.92   19.5     5.13   12.2     25.5    36.6     45.4
#> # … with abbreviated variable name ¹pctl_97.5
#> 
#> $collisions$opt2
#> # A tibble: 3 × 10
#>   season_id period   mean    sd median pctl_2.5 pctl_25 pctl_75 pctl_9…¹ pctl_99
#>   <chr>     <chr>   <dbl> <dbl>  <dbl>    <dbl>   <dbl>   <dbl>    <dbl>   <dbl>
#> 1 a         Jan_Apr 0.728  1.28  0.393   0.0676   0.230   0.655     5.05    11.6 
#> 2 b         May_Sep 1.47   2.64  0.804   0.135    0.469   1.30     10.8     21.8 
#> 3 c         Oct_Dec 0.499  0.888 0.274   0.0448   0.158   0.445     3.43     8.06
#> # … with abbreviated variable name ¹pctl_97.5
#> 
#> $collisions$opt3
#> # A tibble: 3 × 10
#>   season_id period   mean    sd median pctl_2.5 pctl_25 pctl_75 pctl_9…¹ pctl_99
#>   <chr>     <chr>   <dbl> <dbl>  <dbl>    <dbl>   <dbl>   <dbl>    <dbl>   <dbl>
#> 1 a         Jan_Apr 0.350 0.790  0.143   0.0337  0.0913   0.256     3.05     6.66
#> 2 b         May_Sep 0.710 1.61   0.294   0.0687  0.185    0.487     5.55    15.2 
#> 3 c         Oct_Dec 0.240 0.546  0.0995  0.0223  0.0636   0.177     1.93     4.73
#> # … with abbreviated variable name ¹pctl_97.5
#> 
#> 
#> $sampled_pars
#> $sampled_pars$air_gap
#> # A tibble: 1 × 5
#>    mean    sd median pctl_2.5 pctl_97.5
#>   <dbl> <dbl>  <dbl>    <dbl>     <dbl>
#> 1    36     0     36       36        36
#> 
#> $sampled_pars$bld_width
#> # A tibble: 1 × 5
#>    mean    sd median pctl_2.5 pctl_97.5
#>   <dbl> <dbl>  <dbl>    <dbl>     <dbl>
#> 1     8     0      8        8         8
#> 
#> $sampled_pars$body_lt
#> # A tibble: 1 × 5
#>    mean      sd median pctl_2.5 pctl_97.5
#>   <dbl>   <dbl>  <dbl>    <dbl>     <dbl>
#> 1 0.390 0.00499  0.390    0.380     0.400
#> 
#> $sampled_pars$flt_speed
#> # A tibble: 1 × 5
#>    mean    sd median pctl_2.5 pctl_97.5
#>   <dbl> <dbl>  <dbl>    <dbl>     <dbl>
#> 1  7.28  1.47   7.28     4.30      10.0
#> 
#> $sampled_pars$noct_actv
#> # A tibble: 1 × 5
#>     mean      sd median pctl_2.5 pctl_97.5
#>    <dbl>   <dbl>  <dbl>    <dbl>     <dbl>
#> 1 0.0333 0.00498 0.0333   0.0241    0.0436
#> 
#> $sampled_pars$rtr_radius
#> # A tibble: 1 × 5
#>    mean    sd median pctl_2.5 pctl_97.5
#>   <dbl> <dbl>  <dbl>    <dbl>     <dbl>
#> 1    80     0     80       80        80
#> 
#> $sampled_pars$wing_span
#> # A tibble: 1 × 5
#>    mean     sd median pctl_2.5 pctl_97.5
#>   <dbl>  <dbl>  <dbl>    <dbl>     <dbl>
#> 1  1.08 0.0398   1.08     1.00      1.16
#> 
#> $sampled_pars$hub_height
#> # A tibble: 1 × 5
#>    mean    sd median pctl_2.5 pctl_97.5
#>   <dbl> <dbl>  <dbl>    <dbl>     <dbl>
#> 1   116     0    116      116       116
#> 
#> $sampled_pars$dens_mth
#> # A tibble: 12 × 6
#>    period  mean    sd median pctl_2.5 pctl_97.5
#>    <chr>  <dbl> <dbl>  <dbl>    <dbl>     <dbl>
#>  1 Jan    0.849 0.286  0.847    0.315      1.42
#>  2 Feb    0.833 0.224  0.826    0.415      1.28
#>  3 Mar    1.12  0.225  1.12     0.665      1.53
#>  4 Apr    1.11  0.242  1.11     0.647      1.59
#>  5 May    1.22  0.270  1.21     0.700      1.75
#>  6 Jun    0.835 0.252  0.838    0.299      1.35
#>  7 Jul    1.07  0.247  1.07     0.600      1.55
#>  8 Aug    1.04  0.279  1.05     0.490      1.61
#>  9 Sep    1.13  0.277  1.13     0.601      1.66
#> 10 Oct    1.04  0.222  1.04     0.616      1.48
#> 11 Nov    0.965 0.267  0.959    0.460      1.50
#> 12 Dec    1.21  0.199  1.21     0.813      1.59
#> 
#> $sampled_pars$prop_oper_mth
#> # A tibble: 12 × 6
#>    period  mean     sd median pctl_2.5 pctl_97.5
#>    <chr>  <dbl>  <dbl>  <dbl>    <dbl>     <dbl>
#>  1 Jan    0.793 0.0207  0.792    0.752     0.829
#>  2 Feb    0.769 0.0207  0.769    0.729     0.808
#>  3 Mar    0.850 0.0204  0.850    0.807     0.890
#>  4 Apr    0.807 0.0196  0.807    0.769     0.845
#>  5 May    0.847 0.0195  0.847    0.809     0.884
#>  6 Jun    0.866 0.0198  0.866    0.827     0.905
#>  7 Jul    0.869 0.0199  0.869    0.829     0.910
#>  8 Aug    0.806 0.0194  0.806    0.768     0.844
#>  9 Sep    0.879 0.0205  0.879    0.838     0.917
#> 10 Oct    0.820 0.0200  0.820    0.779     0.858
#> 11 Nov    0.858 0.0201  0.858    0.819     0.898
#> 12 Dec    0.910 0.0201  0.911    0.872     0.947
#> 
#> $sampled_pars$downtime
#> # A tibble: 12 × 6
#>    period  mean    sd median pctl_2.5 pctl_97.5
#>    <chr>  <dbl> <dbl>  <dbl>    <dbl>     <dbl>
#>  1 Jan     6.09  2.07   6.12     2.39      10.1
#>  2 Feb     8.98  2.07   8.99     5.12      12.9
#>  3 Mar     8.54  2.04   8.54     4.52      12.8
#>  4 Apr     8.57  1.96   8.61     4.79      12.4
#>  5 May     7.83  1.95   7.81     4.12      11.7
#>  6 Jun     7.64  1.98   7.61     3.78      11.6
#>  7 Jul     9.43  1.99   9.42     5.37      13.5
#>  8 Aug     6.35  1.94   6.31     2.55      10.1
#>  9 Sep     7.46  2.05   7.44     3.62      11.6
#> 10 Oct     6.08  2.00   6.10     2.28      10.1
#> 11 Nov     7.56  2.01   7.54     3.58      11.4
#> 12 Dec     7.03  2.01   6.94     3.32      10.8
#> 
#> $sampled_pars$wind_speed
#> # A tibble: 1 × 5
#>    mean    sd median pctl_2.5 pctl_97.5
#>   <dbl> <dbl>  <dbl>    <dbl>     <dbl>
#> 1  7.73  3.00   7.73     2.15      13.7
#> 
#> $sampled_pars$rtn_speed
#> # A tibble: 1 × 5
#>    mean     sd median pctl_2.5 pctl_97.5
#>   <dbl>  <dbl>  <dbl>    <dbl>     <dbl>
#> 1 0.428 0.0574  0.417    0.345     0.556
#> 
#> $sampled_pars$bld_pitch
#> # A tibble: 1 × 5
#>    mean    sd median pctl_2.5 pctl_97.5
#>   <dbl> <dbl>  <dbl>    <dbl>     <dbl>
#> 1 0.329 0.635      0        0      1.57
#> 
#> $sampled_pars$avoid_bsc
#> # A tibble: 1 × 5
#>    mean       sd median pctl_2.5 pctl_97.5
#>   <dbl>    <dbl>  <dbl>    <dbl>     <dbl>
#> 1 0.990 0.000985  0.990    0.988     0.992
#> 
#> $sampled_pars$avoid_ext
#> # A tibble: 1 × 5
#>    mean      sd median pctl_2.5 pctl_97.5
#>   <dbl>   <dbl>  <dbl>    <dbl>     <dbl>
#> 1 0.960 0.00203  0.960    0.956     0.964
#> 
#> $sampled_pars$prop_crh
#> # A tibble: 1 × 5
#>     mean      sd median pctl_2.5 pctl_97.5
#>    <dbl>   <dbl>  <dbl>    <dbl>     <dbl>
#> 1 0.0605 0.00904 0.0601   0.0441    0.0800
#> 
#> $sampled_pars$gen_fhd
#> # A tibble: 500 × 6
#>   height  mean    sd median pctl_2.5 pctl_97.5
#>