
Integrating AIS Data with Sentinel SAR Data

Introduction

AIS (Automatic Identification System) is an automatic ship identification and tracking system widely used in the maritime domain; its messages can be received terrestrially or by satellite and are mainly used for monitoring shipping safety, marine environmental protection, and fisheries management. SAR (Synthetic Aperture Radar) is a radar technique that synthesizes a large virtual antenna aperture to produce high-resolution images. Integrating AIS and SAR data makes it possible to monitor ship positions and tracks more accurately. Below we provide a set of Shell/Bash scripts to download and process Sentinel SAR data and integrate it with AIS data.

Workflow
  1. Install and configure the required environment
  2. Download AIS data from AIS Hub
  3. Download SAR data from the Sentinel data hub
  4. Process and merge the AIS and SAR data with Python
  5. Visualize the result

Install and configure the required environment

To use this program, the following must first be installed: Bash, Docker, Python 3 with pip, and the Python packages GDAL (osgeo), numpy, matplotlib, and scikit-image (these are what the scripts below use).

Then install the s2s Python package from source on GitHub:

git clone https://github.com/SatelliteApplicationsCatapult/sentinel-to-safety
cd sentinel-to-safety
pip install .
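
If the installation succeeded, the command-line interface used in the next sections should be on your PATH. A quick check (this assumes the package installs a CLI entry point named s2s, as the download script below expects):

# Verify that the s2s CLI is available before continuing
s2s --help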
Download AIS data from AIS Hub

To download AIS data from AIS Hub, first register and obtain a userID and licenseKey. Put these values into the download-ais-data.bash script below, then run the script to download the AIS data.

#!/usr/bin/env bash

# User ID from AIS Hub
USER_ID='your_user_id'
# License key from AIS Hub
LICENSE_KEY='your_license_key'

echo 'Download AIS data from AIS Hub...'
# Fetch the AIS feed with curl, run in a container so nothing has to be installed locally
docker run --rm -v "$(pwd)":/mnt/data curlimages/curl \
    -H "User-Agent: AIShubAPI/1" \
    -H "User-ID: $USER_ID" \
    -H "License-Key: $LICENSE_KEY" \
    --output /mnt/data/ais.json \
    https://www.aishub.net/services/feeds/
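
A minimal way to run the script and sanity-check the downloaded file (the json.tool step only assumes the feed returns JSON, as the .json output name suggests):

bash download-ais-data.bash
# Quick sanity check: pretty-print the start of the downloaded feed
python -m json.tool ais.json | head -n 20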
Download SAR data from the Sentinel data hub

To download SAR data from the Sentinel data hub, you must first register and obtain access credentials. Put the credentials into the download-sar-data.bash script below, specify the time range and area of interest, and then run the script to download the SAR data. Note that the script expects the start and end times in the format YYYY-MM-DDThh:mm:ss.sssZ.

#!/usr/bin/env bash

# Sentinel Hub credentials
SENTINELHUB_URL='https://scihub.copernicus.eu/apihub/'
SENTINELHUB_USERNAME='your_username'
SENTINELHUB_PASSWORD='your_password'

# Query parameters
AREA='41.86,12.4,43.14,13.96'
START_DATE='2021-08-01T00:00:00.000Z'
END_DATE='2021-08-31T23:59:59.999Z'

echo 'Download SAR data from Sentinel Hub...'
s2s download \
    --sentinelhub-url "$SENTINELHUB_URL" \
    --sentinelhub-username "$SENTINELHUB_USERNAME" \
    --sentinelhub-password "$SENTINELHUB_PASSWORD" \
    --output-dir /mnt/data \
    --area "$AREA" \
    --start-date "$START_DATE" \
    --end-date "$END_DATE" \
    --product-type GRD \
    --polarizations VV \
    --orbit-direction DESCENDING \
    --max-cloud-coverage 100 \
    --tile-size 2048
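
Run the script and confirm that products were written to the output directory (the exact file names depend on the s2s package and the query parameters):

bash download-sar-data.bash
# List the downloaded Sentinel-1 products
ls -lh /mnt/data/*.tif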
Process and merge the AIS and SAR data with Python

We now have the data downloaded from AIS Hub and the Sentinel data hub. Next we need a Python program that merges the two datasets for later visualization. Save the following code as merge-ais-sar.py.

import json
import datetime
import math
import logging
import argparse

import numpy as np
from osgeo import gdal
import matplotlib.pyplot as plt

# Set log level to INFO
logging.basicConfig(format='%(levelname)s: %(message)s', level=logging.INFO)

def parse_args():
    """Parse command-line arguments"""
    parser = argparse.ArgumentParser(description='Merge AIS and SAR data')
    parser.add_argument('--ais', type=str, required=True,
                        help='Path to the AIS JSON data')
    parser.add_argument('--sar', type=str, required=True,
                        help='Path to the Sentinel-1 data')
    parser.add_argument('--output', type=str, required=True,
                        help='Output file path for processed data')

    return parser.parse_args()

def parse_ais(data):
    """Parse AIS data"""
    ships = []
    for ship in data['features']:
        try:
            time = datetime.datetime.strptime(ship['properties']['BaseDateTime'], '%Y-%m-%dT%H:%M:%S.%fZ')
            lat = float(ship['geometry']['coordinates'][1])
            lon = float(ship['geometry']['coordinates'][0])

            if math.isnan(lat) or math.isnan(lon):
                continue

            ships.append({
                'time': time,
                'lat': lat,
                'lon': lon
            })
        except (KeyError, ValueError, TypeError):
            logging.warning('Failed to parse AIS data for ship: %s',
                            ship.get('properties', {}).get('Name', 'unknown'))

    return ships

def parse_sar(filename):
    """Parse SAR data"""
    ds = gdal.Open(filename)
    transform = ds.GetGeoTransform()
    x_size = ds.RasterXSize
    y_size = ds.RasterYSize
    data = ds.GetRasterBand(1).ReadAsArray()

    return {
        'min_lat': transform[3] + y_size * transform[5],
        'max_lat': transform[3],
        'min_lon': transform[0],
        'max_lon': transform[0] + x_size * transform[1],
        'data': data,
        'transform': transform
    }

def merge(ais_data, sar_data, output_path):
    """Merge AIS and SAR data and write the result to output_path"""
    # APSIS algorithm - Adaptive Piecewise Skeletonization Image Segmentation
    # Algorithm reference: https://ieeexplore.ieee.org/abstract/document/4359717
    # Code reference: https://github.com/mrademaker/apsis
    def apsis(sar_data):
        from skimage import img_as_ubyte
        from skimage.segmentation import slic
        from skimage.segmentation import mark_boundaries
        from skimage import color
        from skimage import filters

        # Normalise to [0, 1] so img_as_ubyte accepts any input dtype
        raw = sar_data['data'].astype(float)
        raw = (raw - raw.min()) / (raw.max() - raw.min() + 1e-12)
        data = img_as_ubyte(raw)

        # Compute an Otsu threshold and binarise the data
        thresh = filters.threshold_otsu(data)
        binary = data > thresh

        # Create the SLIC superpixels (channel_axis=None: single-channel image)
        segments = slic(binary, compactness=5, n_segments=3000, channel_axis=None)

        # Compute the average brightness in each segment
        brightness = color.rgb2gray(mark_boundaries(data, segments))

        # Compute the lowest brightness value in each region
        region_brightness = {}
        for i in range(segments.max() + 1):
            region_brightness[i] = float('inf')

        for i in range(segments.shape[0]):
            for j in range(segments.shape[1]):
                if brightness[i, j] < region_brightness[segments[i, j]]:
                    region_brightness[segments[i, j]] = brightness[i, j]

        region_brightness = sorted(region_brightness.items(), key=lambda x: x[1])

        # Start from the darkest segment and grow by adding segments that
        # touch an already-kept segment (4-connected adjacency)
        segments_to_keep = set()
        for i, (segment, _) in enumerate(region_brightness):
            if i == 0:
                segments_to_keep.add(segment)
                continue

            # Collect the labels of segments adjacent to this one
            mask = segments == segment
            neighbors = set()
            neighbors.update(segments[:-1, :][mask[1:, :]].tolist())   # above
            neighbors.update(segments[1:, :][mask[:-1, :]].tolist())   # below
            neighbors.update(segments[:, :-1][mask[:, 1:]].tolist())   # left
            neighbors.update(segments[:, 1:][mask[:, :-1]].tolist())   # right

            if neighbors & segments_to_keep:
                segments_to_keep.add(segment)

        # Binary mask of the kept segments
        binary_apsis = np.isin(segments, list(segments_to_keep))

        return {
            'min_lat': sar_data['min_lat'],
            'max_lat': sar_data['max_lat'],
            'min_lon': sar_data['min_lon'],
            'max_lon': sar_data['max_lon'],
            'data': binary_apsis,
            'transform': sar_data['transform']
        }

    apsis_data = apsis(sar_data)
    plt.imshow(apsis_data['data'], cmap='gray', extent=[apsis_data['min_lon'], apsis_data['max_lon'], apsis_data['min_lat'], apsis_data['max_lat']])

    # Plot AIS data on top of SAR data
    for ship in ais_data:
        plt.plot(ship['lon'], ship['lat'], 'ro')

    plt.savefig('ais-sar.png')

    with open(output_path, 'w') as f:
        f.write('{"type": "FeatureCollection", "features": [')

        for i, ship in enumerate(ais_data):
            if i == 0:
                f.write('\n')
            else:
                f.write(',\n')

            f.write('{"type": "Feature", "geometry": {"type": "Point", "coordinates": [%f, %f]}, "properties": {"time": "%s"}}' % (ship['lon'], ship['lat'], ship['time'].isoformat()))

        f.write(']}')

    logging.info('Merged AIS and SAR data saved to ais-sar.png and %s', output_path)

if __name__ == '__main__':
    args = parse_args()

    logging.info('Parsing AIS data...')
    with open(args.ais, 'r') as f:
        ais_data = parse_ais(json.load(f))

    logging.info('Parsing SAR data...')
    sar_data = parse_sar(args.sar)

    logging.info('Merging AIS and SAR data...')
    merge(ais_data, sar_data, args.output)

    logging.info('Finished')

Now run the script as follows.

python merge-ais-sar.py --ais ais.json --sar sentinel-1/20191201T055704_20191201T055809_T33SVF_VV_grd_mli_geo_norm.tif --output ais-sar.geojson
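
For reference, parse_ais expects the AIS file to be a GeoJSON FeatureCollection whose features carry BaseDateTime, Name, and Point coordinates (these field names come from the parser above; the real AIS Hub export may differ). A synthetic sample file for a quick dry run:

# Write a minimal synthetic AIS file to sanity-check the parser
cat > ais-sample.json <<'EOF'
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "geometry": {"type": "Point", "coordinates": [12.5, 42.0]},
      "properties": {"Name": "TEST SHIP", "BaseDateTime": "2021-08-15T12:00:00.000Z"}
    }
  ]
}
EOF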
Visualize the result

Finally, visualize the result on a map. Here we use the https://geojson.io/ website: upload the generated ais-sar.geojson file to view the result on a map.

Map Visualization
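
If you prefer a quick local preview instead of uploading, here is a minimal sketch that scatter-plots the points from the generated GeoJSON with matplotlib (it assumes only the Point features written by merge-ais-sar.py):

import json
import matplotlib.pyplot as plt

# Load the GeoJSON produced by merge-ais-sar.py and plot the ship positions
with open('ais-sar.geojson') as f:
    collection = json.load(f)

lons = [feat['geometry']['coordinates'][0] for feat in collection['features']]
lats = [feat['geometry']['coordinates'][1] for feat in collection['features']]

plt.scatter(lons, lats, c='red', marker='o')
plt.xlabel('Longitude')
plt.ylabel('Latitude')
plt.title('AIS ship positions')
plt.savefig('ais-positions.png')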

Conclusion

With this, we have used Shell/Bash scripts to download and process Sentinel SAR data and to integrate it with AIS data. By running the Python code to merge the two datasets and visualizing the result on a map, we get a more accurate picture of ship positions and tracks.